How do I access PyTorch model parameters by index and then update a specific layer?



Suppose I have a 5-layer network and I only want to update the first layer.

Currently I am doing something like the following. I managed to get the weights of the first layer (and only the first layer), change them, and return them. However, I cannot combine the modified weights back into the original myModel and update it.

for parameter in myModel.parameters():
    layers = [x.data for x in myModel.parameters()]
print(layers[0])  # get the first layer's weights
...               # code to change the weights of the first layer
# now how do I update the first layer of myModel?
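For reference, a minimal sketch of writing a changed tensor back into the parameter selected by index could look like the following. Here `new_weights` is a hypothetical tensor with the same shape as the first layer's weight, and the sketch assumes gradients do not need to flow through the change:

import torch

params = list(myModel.parameters())   # parameters in registration order; index 0 is the first layer's weight
with torch.no_grad():                 # keep the manual update out of autograd
    params[0].copy_(new_weights)      # write the modified tensor back into the first layer in place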
---

Model class definition:

import torch
import torch.nn as nn
import torch.nn.functional as F

class Actor(nn.Module):
    def __init__(self, state_dim=s_dim, action_dim=1, action_lim=a_max):  # s_dim and a_max come from the surrounding training code
        super(Actor, self).__init__()
        self.state_dim = state_dim
        self.action_dim = action_dim
        self.action_lim = torch.Tensor([action_lim])
        self.fitness = 0
        self.fc1 = nn.Linear(state_dim, 256)
        self.bn1 = nn.BatchNorm1d(1)
        self.fc2 = nn.Linear(256, 128)
        #self.fc2.weight.data = fanin_init(self.fc2.weight.data.size())
        self.bn2 = nn.BatchNorm1d(1)
        self.fc3 = nn.Linear(128, 64)
        self.bn3 = nn.BatchNorm1d(1)
        self.fc4 = nn.Linear(64, action_dim)

    def forward(self, state):
        x = self.fc1(state)
        x = F.relu(x)
        x = self.fc2(x)
        x = F.relu(x)
        x = self.fc3(x)
        x = F.relu(x)
        action = torch.tanh(self.fc4(x))  # was T.tanh; torch.tanh avoids the undefined alias
        action = action * self.action_lim
        return action

This question came up while I was working on the crossover operation of a genetic algorithm. The idea is to cross over the first-layer weights of two actors (the genes), and then update the two original actors with the two resulting first layers (the children). So I am trying to obtain updated actor models in which only the first layer's weights have changed.

for name, target_param in gene1.named_parameters():
    layers1 = [x.data for x in gene1.parameters()]
    layers2 = [x.data for x in gene2.parameters()]
    y1 = layers1[0]
    y2 = layers2[0]
    try: num_cross_overs = random.randint(0, int(layers1[0].shape[0] * 0.3))  # number of crossovers
    except: num_cross_overs = 1
    for i in range(num_cross_overs):
        receiver_choice = random.random()  # choose which gene receives the perturbation; random float in [0, 1)
        if receiver_choice < 0.5:  # gene1 receives the row from gene2
            ind_cr = random.randint(0, y1.shape[0] - 1)  # only the layer's output rows are crossed over; pick a crossover point
            y1[ind_cr, :] = y2[ind_cr, :]  # swap a single row of the weights; several rows may be swapped across the crossovers
        else:
            ind_cr = random.randint(0, y1.shape[0] - 1)
            y2[ind_cr, :] = y1[ind_cr, :]

---

You can do the following. From the context, I am assuming you do not need to backpropagate through the weight-change step.

import random

y1 = gene1.fc1.weight
y2 = gene2.fc1.weight
try: num_cross_overs = random.randint(0, int(y1.shape[0] * 0.3))  # number of crossovers
except: num_cross_overs = 1
for i in range(num_cross_overs):
    receiver_choice = random.random()  # choose which gene receives the perturbation; random float in [0, 1)
    if receiver_choice < 0.5:  # gene1 receives the row from gene2
        ind_cr = random.randint(0, y1.shape[0] - 1)  # only the layer's output rows are crossed over; pick a crossover point
        y1.data[ind_cr, :] = y2.data[ind_cr, :]  # write into the parameter's storage in place, outside autograd
    else:
        ind_cr = random.randint(0, y1.shape[0] - 1)
        y2.data[ind_cr, :] = y1.data[ind_cr, :]
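Because gene1.fc1.weight and gene2.fc1.weight are the live nn.Parameter objects of the two actors, the in-place row assignments above change the models directly; there is nothing to merge back afterwards. As a quick sanity check, sketched here with placeholder constructor arguments (the real s_dim / a_max values come from your own setup):

import torch

gene1 = Actor(state_dim=4, action_dim=1, action_lim=1.0)   # placeholder dimensions, for illustration only
gene2 = Actor(state_dim=4, action_dim=1, action_lim=1.0)

before = gene1.fc1.weight.detach().clone()
# ... run the crossover loop above ...
after = gene1.fc1.weight.detach()
changed_rows = (before != after).any(dim=1).nonzero().flatten()
print(changed_rows)   # indices of fc1 rows that now hold gene2's values

If you prefer not to touch .data, the same row swaps also work with plain indexed assignment inside a with torch.no_grad(): block.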
