The MultiplicativeLR scheduler does not work when scheduler.step() is called



In the PyTorch Lightning framework, I configure my optimizer like this:

def configure_optimizers(self):
    opt = torch.optim.Adam(self.model.parameters(), lr=cfg.learning_rate)
    # modified to fit lightning
    sch = torch.optim.lr_scheduler.MultiplicativeLR(opt, lr_lambda=0.95)  # decrease of 5% every epoch

    return [opt], [sch]

Then, in training_step, I can either call the lr_scheduler manually or let Lightning call it automatically. Either way, I get this error:

lr_scheduler["scheduler"].step()
  File "/home/lsa/anaconda3/envs/randla_36/lib/python3.6/site-packages/torch/optim/lr_scheduler.py", line 152, in step
    values = self.get_lr()
  File "/home/lsa/anaconda3/envs/randla_36/lib/python3.6/site-packages/torch/optim/lr_scheduler.py", line 329, in get_lr
    for lmbda, group in zip(self.lr_lambdas, self.optimizer.param_groups)]
  File "/home/lsa/anaconda3/envs/randla_36/lib/python3.6/site-packages/torch/optim/lr_scheduler.py", line 329, in <listcomp>
    for lmbda, group in zip(self.lr_lambdas, self.optimizer.param_groups)]
TypeError: 'float' object is not callable

But if I use any other scheduler, not only does VSCode recognize it as part of PyTorch, I also don't get this error.

PyTorch version 1.10, Lightning version 1.5

I think you need to change the value of lr_lambda. Here is a link to the documentation: https://pytorch.org/docs/stable/generated/torch.optim.lr_scheduler.MultiplicativeLR.html

lr_lambda (function or list) – A function which computes a multiplicative factor given an integer parameter epoch, or a list of such functions, one for each group in optimizer.param_groups.
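To illustrate the "list of such functions" case from the docs, here is a minimal sketch (the two-group optimizer and module names are made up for illustration) where each entry in optimizer.param_groups gets its own lambda:

```python
import torch

# Hypothetical two-group setup: decay the backbone lr by 5% per epoch,
# keep the head lr constant via a lambda that always returns 1.0.
backbone = torch.nn.Linear(4, 4)
head = torch.nn.Linear(4, 2)
opt = torch.optim.Adam([
    {"params": backbone.parameters(), "lr": 0.01},
    {"params": head.parameters(), "lr": 0.1},
])
sch = torch.optim.lr_scheduler.MultiplicativeLR(
    opt, lr_lambda=[lambda epoch: 0.95, lambda epoch: 1.0]
)

opt.step()  # step the optimizer before the scheduler to avoid a warning
sch.step()
print([g["lr"] for g in opt.param_groups])  # backbone lr decayed, head lr unchanged
```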

So, if you want a 5% decrease every epoch, you can do it like this:


def configure_optimizers(self):
    opt = torch.optim.Adam(self.model.parameters(), lr=cfg.learning_rate)
    # modified to fit lightning
    lmbda = lambda epoch: 0.95
    sch = torch.optim.lr_scheduler.MultiplicativeLR(opt, lr_lambda=lmbda)  # decrease of 5% every epoch

    return [opt], [sch]
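You can verify the fix outside Lightning with a small standalone check (the tiny Linear model here is just a stand-in): once lr_lambda is a callable, MultiplicativeLR multiplies the current learning rate by its return value on every step, so the error disappears and the lr decays by 5% each epoch.

```python
import torch

# Standalone check that MultiplicativeLR accepts a callable and
# multiplies the lr by 0.95 on each scheduler step.
model = torch.nn.Linear(2, 2)  # hypothetical tiny model for illustration
opt = torch.optim.Adam(model.parameters(), lr=0.01)
sch = torch.optim.lr_scheduler.MultiplicativeLR(opt, lr_lambda=lambda epoch: 0.95)

lrs = []
for _ in range(3):
    opt.step()  # step the optimizer first to avoid a warning
    sch.step()  # lr <- lr * 0.95
    lrs.append(opt.param_groups[0]["lr"])

print(lrs)  # each entry is 95% of the previous learning rate
```

Passing the bare float 0.95, as in the question, fails precisely because the scheduler calls lr_lambda(epoch) internally, hence the TypeError: 'float' object is not callable.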