Incorrect output when computing cross-entropy loss with PyTorch


Hi everyone! I am computing the cross-entropy loss with PyTorch: Input = torch.tensor([[1.,0.0,0.0],[1.,0.0,0.0]]), label = torch.tensor([0, 0]). The output should be 0, but I get tensor(0.5514). Can anyone tell me why it is 0.55 and not 0? Code is below for reference.

Yes, you are getting the correct output.

import torch
Input = torch.tensor([[1.,0.0,0.0],[1.,0.0,0.0]])
label = torch.tensor([0, 0])
print(torch.nn.functional.cross_entropy(Input,label))
# tensor(0.5514)

The torch.nn.functional.cross_entropy function combines log_softmax and nll_loss into a single function; it is equivalent to:

torch.nn.functional.nll_loss(torch.nn.functional.log_softmax(Input, 1), label)
Reference code:
print(torch.nn.functional.softmax(Input, 1).log())
tensor([[-0.5514, -1.5514, -1.5514],
[-0.5514, -1.5514, -1.5514]])
print(torch.nn.functional.log_softmax(Input, 1))
tensor([[-0.5514, -1.5514, -1.5514],
[-0.5514, -1.5514, -1.5514]])
print(torch.nn.functional.nll_loss(torch.nn.functional.log_softmax(Input, 1), label))
tensor(0.5514)

Now you can see that:

torch.nn.functional.cross_entropy(Input, label) ==

torch.nn.functional.nll_loss(torch.nn.functional.log_softmax(Input, 1), label)
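The 0.5514 can also be checked by hand: cross_entropy treats Input as raw logits, so the loss for class 0 is -log(e^1 / (e^1 + e^0 + e^0)) ≈ 0.5514. To drive the loss toward 0, the logit for the true class has to dominate the others. A minimal sketch (the large logit value 100.0 below is my own illustration, not from the original post):

```python
import math
import torch
import torch.nn.functional as F

# Hand-computed: -log(softmax([1, 0, 0])[0]) = -log(e / (e + 1 + 1))
manual = -math.log(math.exp(1.0) / (math.exp(1.0) + 2.0))
print(manual)  # ≈ 0.5514

Input = torch.tensor([[1.0, 0.0, 0.0], [1.0, 0.0, 0.0]])
label = torch.tensor([0, 0])
print(F.cross_entropy(Input, label))  # tensor(0.5514), matching the manual value

# With a dominant logit for the true class, the loss approaches 0:
big = torch.tensor([[100.0, 0.0, 0.0], [100.0, 0.0, 0.0]])
print(F.cross_entropy(big, label))  # ~0, up to floating-point precision
```

So the inputs in the question are not probabilities but logits, and softmax([1, 0, 0]) assigns only about 0.576 probability to class 0, which is why the loss is not 0.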
