How to implement Softmax Regression with PyTorch



I am working on a uni assignment that requires implementing Softmax Regression with PyTorch. The task says:

Implement Softmax Regression as an nn.Module and pipe its output with torch.nn.Softmax.

Since I am new to PyTorch, I don't know how to do this. This is what I have tried so far:

import torch
import torch.nn as nn

class SoftmaxRegression(nn.Module):  # inherit from nn.Module

    def __init__(self, num_labels, num_features):
        super(SoftmaxRegression, self).__init__()
        # nn.Linear expects (in_features, out_features)
        self.linear = torch.nn.Linear(num_features, num_labels)

    def forward(self, x):
        # should return the probabilities for the classes, e.g.
        # tensor([[ 0.1757,  0.3948,  0.4295],
        #         [ 0.0777,  0.3502,  0.5721],
        #         ...
        # not sure what to do here
        pass

Does anyone know how I should do this? I am not sure what to write in the forward method. Any help is appreciated!

As far as I understand it, the task wants you to implement your own version of the Softmax function. However, I am not sure what you mean by "and pipe its output with torch.nn.Softmax". Are they asking you to return, from your custom nn.Module, both the output of your custom Softmax and the output of torch.nn.Softmax? You could do something like this (a sketch of how to combine it with the linear layer follows after the snippet):

import torch
import torch.nn as nn

class SoftmaxRegression(nn.Module):
    def __init__(self, dim=0):
        super(SoftmaxRegression, self).__init__()
        # dim is the dimension along which the probabilities sum to 1;
        # for a batch of shape (batch_size, num_classes) use dim=1 (or -1)
        self.dim = dim

    def forward(self, x):
        # subtracting the mean along dim does not change the result
        # (softmax is shift-invariant) but helps avoid overflow in torch.exp;
        # subtracting the max is the more common choice
        means = torch.mean(x, self.dim, keepdim=True)
        exp_x = torch.exp(x - means)
        sum_exp_x = torch.sum(exp_x, self.dim, keepdim=True)
        return exp_x / sum_exp_x
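
If "pipe its output with torch.nn.Softmax" simply means feeding the output of the linear (regression) part through torch.nn.Softmax, a minimal sketch could look like the one below. This is only my reading of the assignment; the names num_features and num_labels are taken from your question, and dim=1 assumes the input is a batch of shape (batch_size, num_features).

import torch
import torch.nn as nn

class SoftmaxRegression(nn.Module):
    def __init__(self, num_features, num_labels):
        super(SoftmaxRegression, self).__init__()
        self.linear = nn.Linear(num_features, num_labels)
        # dim=1 assumes input of shape (batch_size, num_features)
        self.softmax = nn.Softmax(dim=1)

    def forward(self, x):
        logits = self.linear(x)       # raw class scores
        return self.softmax(logits)   # probabilities, each row sums to 1

# hypothetical usage: 4 input features, 3 classes, a batch of 2 samples
model = SoftmaxRegression(num_features=4, num_labels=3)
probs = model(torch.randn(2, 4))
print(probs)

Note that for training you would usually return the raw logits and use nn.CrossEntropyLoss, which applies log-softmax internally, but your assignment explicitly asks for the softmax probabilities as output.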
