Cannot get the hidden weight values of an RNN



I declare my RNN as

self.rnn = torch.nn.RNN(input_size=encoding_dim, hidden_size=1, num_layers=1, nonlinearity='relu')

Later on

self.rnn.all_weights
# [[Parameter containing:
#   tensor([[-0.8099, -0.9543,  0.1117,  0.6221,  0.5034, -0.6766, -0.3360, -0.1700,
#           -0.9361, -0.3428]], requires_grad=True),
#   Parameter containing:
#   tensor([[-0.1929]], requires_grad=True),
#   Parameter containing:
#   tensor([0.7881], requires_grad=True),
#   Parameter containing:
#   tensor([0.4320], requires_grad=True)]]
self.rnn.all_weights[0][0][0].values
# {RuntimeError}Could not run 'aten::values' with arguments from the 'CPU' backend. 'aten::values' is only available for these backends: [SparseCPU, Autograd, Profiler, Tracer].

Clearly I can see the values of the weights, but I cannot access them. The documentation says I need to specify requires_grad=True, but that does not work.

Is there a more elegant and more practical way than self.rnn.all_weights[0][0][0]?

Use torch.nn.Module.named_parameters or torch.nn.Module.parameters:

>>> import torch.nn as nn
>>> encoding_dim = 10  # matches the 10-element weight row shown in the question
>>> model = nn.RNN(input_size=encoding_dim, hidden_size=1, num_layers=1, nonlinearity='relu')
>>> weights = []
>>> for name, parameter in model.named_parameters():
...     weights.append({name: parameter[0]})  # first row of each parameter, keyed by its name
...
>>> just_weights = []
>>> for parameter in model.parameters():
...     just_weights.append(parameter[0])  # same rows, without the names
...
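
If what you are after is the raw numbers rather than Parameter objects, note that the aten::values error above comes from .values being defined only for sparse tensors; it is not how you read a dense parameter. A minimal sketch, assuming the same single-layer model as above (weight_ih_l0, weight_hh_l0, bias_ih_l0 and bias_hh_l0 are the attribute names nn.RNN registers for its flat weights):

>>> w_ih = model.weight_ih_l0                 # input-to-hidden weights, shape (hidden_size, input_size)
>>> w_ih.detach().numpy()                     # plain NumPy array, detached from autograd
>>> model.weight_hh_l0.item()                 # hidden-to-hidden weight is 1x1 here, so .item() yields a Python float
>>> {name: p.detach() for name, p in model.named_parameters()}  # everything at once, as plain tensors

detach() gives a view of the same storage that sits outside the autograd graph, which is usually what you want when you only need to read or log the values; model.state_dict() is another convenient read-only snapshot of all of them.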
