How to plot loss against steps_per_epoch when using fit_generator in Keras



I have the following code, and I want to plot loss against steps_per_epoch:

model = unet(pretrained=False)
model.compile(optimizer=Adam(0.005), loss="binary_crossentropy",
              metrics=["accuracy"])
history = model.fit_generator(train_gen, steps_per_epoch=500, epochs=5,
                              callbacks=[dynamic_lr, chkp])

where dynamic_lr and chkp are my callbacks for the model:

def lr_scheduler(epoch, lr):
    if epoch <= 2:
        lr = 0.002
        return lr
    lr = 0.001
    return lr
chkp = keras.callbacks.ModelCheckpoint(
filepath="mypath/model.hdf5",
monitor="loss",
verbose=1,
save_best_only=True,
mode="min",
)
dynamic_lr = LearningRateScheduler(lr_scheduler, verbose=1)  

I don't think the history dict saves the loss for every step within an epoch, but is there some way to do this?

You can get the values of training accuracy, training loss, validation accuracy, and validation loss from the history object. See the code below. (Note that the val_* keys are only present if validation data was passed to fit_generator.)

training_accuracy = history.history['accuracy']
training_loss = history.history['loss']
valid_accuracy = history.history['val_accuracy']
valid_loss = history.history['val_loss']
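These history values are recorded once per epoch, so they won't give you the loss at every step. To capture per-step (per-batch) loss you can record it yourself in a custom callback. A minimal sketch, assuming TensorFlow's bundled Keras (the BatchLossHistory name is made up for illustration):

```python
import tensorflow as tf

class BatchLossHistory(tf.keras.callbacks.Callback):
    """Record the training loss after every batch (i.e. every step)."""

    def on_train_begin(self, logs=None):
        self.batch_losses = []

    def on_train_batch_end(self, batch, logs=None):
        # Note: during training, logs["loss"] is the running average
        # of the loss over the current epoch, not the raw batch loss.
        self.batch_losses.append(logs.get("loss"))

# Usage (sketch):
# batch_history = BatchLossHistory()
# model.fit_generator(train_gen, steps_per_epoch=500, epochs=5,
#                     callbacks=[dynamic_lr, chkp, batch_history])
# import matplotlib.pyplot as plt
# plt.plot(batch_history.batch_losses)
# plt.xlabel("step"); plt.ylabel("loss"); plt.show()
```

After training, batch_history.batch_losses holds one value per step, which you can plot directly against the step index.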
