Randomly Set Some Elements In A Tensor To Zero (with Low Computational Time)
I have a tensor of shape (3072, 1000) which represents the weights in my neural network. I want to randomly set 60% of its elements to zero and, after updating the weights, keep 60% of the elements at zero.
Solution 1:
You can use the dropout function for this:
import torch.nn.functional as F

# Each element is zeroed independently with probability p=0.6
my_tensor.weight = F.dropout(my_tensor.weight, p=0.6)
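Note that F.dropout (with its default training=True) also rescales the surviving elements by 1/(1 - p), so the nonzero values change. If you only want the zeros without the rescaling, a minimal sketch is to draw a Bernoulli mask yourself (the tensor name weights below is just an example, not from the question):

import torch

weights = torch.randn(3072, 1000)                         # example weight tensor
p = 0.6                                                   # target fraction of zeros
mask = torch.bernoulli(torch.full_like(weights, 1 - p))   # 1 with prob 0.4, 0 with prob 0.6
weights = weights * mask                                  # ~60% of entries become exactly zero
print((weights == 0).float().mean())                      # prints roughly 0.6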
Solution 2:
iacob's answer is perfect if you want approximately 60% of the weights to be set to zero. If you want to set exactly m values in your tensor to zero, you can do something like this:
import numpy as np
import torch

n = mytensor.weight.numel()                        # total number of elements
m = int(round(n * 0.6))                            # exact number of zeros
indices = np.random.choice(n, m, replace=False)    # alternative: indices = torch.randperm(n)[:m]
mytensor.weight = mytensor.weight.contiguous()     # ensure flatten() returns a view, not a copy
mytensor.weight.flatten()[indices] = 0             # zero exactly m elements in place
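If you also need entries to stay zero after each optimizer step, as the question asks, one approach is to store the mask once and re-apply it after every update inside torch.no_grad(). The sketch below assumes a standalone parameter named weight and an SGD optimizer; both names are just illustrative:

import torch

weight = torch.nn.Parameter(torch.randn(3072, 1000))   # hypothetical weight parameter
p = 0.6
mask = (torch.rand_like(weight) >= p).float()           # 0 with probability ~0.6, 1 otherwise
optimizer = torch.optim.SGD([weight], lr=0.01)

with torch.no_grad():
    weight.mul_(mask)                                   # initial zeroing

# ... later, inside the training loop, after each optimizer.step():
with torch.no_grad():
    weight.mul_(mask)                                   # restore the zeros the update disturbed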