I have the following code, and I want to plot the loss against the steps within each epoch (i.e. per batch, not per epoch):
model = unet(pretrained=False)
model.compile(optimizer=Adam(0.005), loss="binary_crossentropy",
              metrics=["accuracy"])
history = model.fit_generator(train_gen, steps_per_epoch=500, epochs=5,
                              callbacks=[dynamic_lr, chkp])
where dynamic_lr and chkp are the callbacks I pass to the model:
def lr_scheduler(epoch, lr):
    if epoch <= 2:
        lr = 0.002
        return lr
    lr = 0.001
    return lr
chkp = keras.callbacks.ModelCheckpoint(
    filepath="mypath/model.hdf5",
    monitor="loss",
    verbose=1,
    save_best_only=True,
    mode="min",
)
dynamic_lr = LearningRateScheduler(lr_scheduler, verbose=1)
I don't think the history dict stores the loss for every step within an epoch, but is there some way to get it?
You can get the training accuracy, training loss, validation accuracy, and validation loss from the history object (note that the `val_*` keys are only present if you pass validation data to `fit`). See the code below.
training_accuracy=history.history['accuracy']
training_loss=history.history['loss']
valid_accuracy=history.history['val_accuracy']
valid_loss=history.history['val_loss']
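Note that `history.history` only records one value per epoch, not per step. To capture the loss at every step (batch), one common approach, sketched here assuming the tf.keras callback API, is a small custom callback that appends `logs["loss"]` in `on_train_batch_end`; you can then plot the collected list:

```python
from tensorflow import keras

class BatchLossHistory(keras.callbacks.Callback):
    """Records the training loss after every batch (step)."""

    def __init__(self):
        super().__init__()
        self.batch_losses = []

    def on_train_batch_end(self, batch, logs=None):
        # logs["loss"] is the running-average loss for the current epoch
        self.batch_losses.append(logs["loss"])

# Hypothetical usage, adapted to the code in the question:
# batch_history = BatchLossHistory()
# history = model.fit_generator(train_gen, steps_per_epoch=500, epochs=5,
#                               callbacks=[dynamic_lr, chkp, batch_history])
#
# import matplotlib.pyplot as plt
# plt.plot(batch_history.batch_losses)
# plt.xlabel("step")
# plt.ylabel("loss")
# plt.show()
```

With `steps_per_epoch=500` and `epochs=5`, `batch_losses` would contain 2500 values, one per step across all epochs.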