Cross entropy loss in pytorch nn.CrossEntropyLoss()
By : Александр Яковлв
Date : March 29 2020, 07:55 AM
Maybe someone is able to help me here. I am trying to compute the cross entropy loss of a given output of my network. Please check this code :
import torch
import torch.nn as nn
from torch.autograd import Variable
output = Variable(torch.rand(1,10))
target = Variable(torch.LongTensor([1]))
criterion = nn.CrossEntropyLoss()
loss = criterion(output, target)
print(loss)
Variable containing:
2.4498
[torch.FloatTensor of size 1]
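The value above can be reproduced by hand: `nn.CrossEntropyLoss` applies log-softmax to the raw scores and takes the negative log-probability of the target class. A minimal sketch (using plain tensors instead of the now-unnecessary `Variable`, and an assumed seed for reproducibility):

```python
import torch
import torch.nn as nn

torch.manual_seed(0)                 # assumed seed, only for reproducibility
output = torch.rand(1, 10)           # raw, unnormalized scores (logits)
target = torch.tensor([1])

criterion = nn.CrossEntropyLoss()
loss = criterion(output, target)

# Manual equivalent: pick -log_softmax(output) at the target index
manual = -torch.log_softmax(output, dim=1)[0, target[0]]
print(loss.item(), manual.item())    # the two values match
```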

Cross Entropy in PyTorch
By : Leonardo A. Garza
Date : March 29 2020, 07:55 AM
In your example you are treating the output [0,0,0,1] as probabilities, as required by the mathematical definition of cross entropy. But PyTorch treats them as raw outputs that don't need to sum to 1, and that need to be converted into probabilities first, for which it uses the softmax function. So H(p,q) becomes H(p, softmax(q)).

Custom crossentropy loss in pytorch
By : user2892869
Date : March 29 2020, 07:55 AM
will be helpful for those in need If you need just cross entropy, you can take advantage of the fact that PyTorch already defines it. code :
import torch.nn.functional as F
loss_func = F.cross_entropy
def log_softmax(x):
    return x - x.exp().sum(1).log().unsqueeze(1)
# Internally, F.cross_entropy is essentially:
nll_loss(log_softmax(input, 1), target, weight, None, ignore_index, None, reduction)
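To see that the hand-rolled `log_softmax` really matches the built-in loss, here is a small self-contained check (input shapes and values are assumed for illustration):

```python
import torch
import torch.nn.functional as F

def log_softmax(x):
    # log(softmax(x)) = x - log(sum(exp(x))), computed row-wise
    return x - x.exp().sum(1).log().unsqueeze(1)

torch.manual_seed(0)
inp = torch.randn(4, 5)              # batch of 4, 5 classes
target = torch.tensor([0, 3, 1, 4])

manual = F.nll_loss(log_softmax(inp), target)
builtin = F.cross_entropy(inp, target)
print(manual.item(), builtin.item())   # the two agree
```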

Semantic Segmentation — Usage of Categorical CrossEntropy instead of Binary CrossEntropy for Binary Image Segmentation
By : user3417310
Date : March 29 2020, 07:55 AM

How do I calculate crossentropy from probabilities in PyTorch?
By : user3675524
Date : March 29 2020, 07:55 AM
Hope that helps. There is a reduction parameter for all loss functions in PyTorch. As you can see from the documentation, the default reduction parameter is 'mean', which divides the sum by the number of elements in the batch. To get the summation behavior (0.4338) you want, pass the reduction parameter as follows: code :
F.nll_loss(torch.log(probs), labels, reduction='sum')
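A quick sketch of the difference between the two reductions (the probabilities and labels here are assumed, not the asker's data): 'sum' and the default 'mean' differ exactly by a factor of the batch size.

```python
import torch
import torch.nn.functional as F

probs = torch.tensor([[0.9, 0.1], [0.2, 0.8], [0.6, 0.4]])
labels = torch.tensor([0, 1, 0])

summed = F.nll_loss(torch.log(probs), labels, reduction='sum')
mean = F.nll_loss(torch.log(probs), labels)   # default reduction='mean'
print(summed.item(), mean.item() * len(labels))   # equal: sum = mean * batch size
```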

