Can Keras use sklearn in a custom metric to compute a micro F1 score?



I found one version on Stack Overflow:

from keras import backend as K

def f1(y_true, y_pred):
    def recall(y_true, y_pred):
        """Recall metric.

        Only computes a batch-wise average of recall.
        Computes the recall, a metric for multi-label classification of
        how many relevant items are selected.
        """
        true_positives = K.sum(K.round(K.clip(y_true * y_pred, 0, 1)))
        possible_positives = K.sum(K.round(K.clip(y_true, 0, 1)))
        recall = true_positives / (possible_positives + K.epsilon())
        return recall

    def precision(y_true, y_pred):
        """Precision metric.

        Only computes a batch-wise average of precision.
        Computes the precision, a metric for multi-label classification of
        how many selected items are relevant.
        """
        true_positives = K.sum(K.round(K.clip(y_true * y_pred, 0, 1)))
        predicted_positives = K.sum(K.round(K.clip(y_pred, 0, 1)))
        precision = true_positives / (predicted_positives + K.epsilon())
        return precision

    precision = precision(y_true, y_pred)
    recall = recall(y_true, y_pred)
    return 2 * ((precision * recall) / (precision + recall + K.epsilon()))

model.compile(loss='binary_crossentropy',
              optimizer='adam',
              metrics=[f1])
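As a sanity check on the formula above, the same batch-wise computation can be reproduced in plain NumPy (a sketch I added for illustration, not part of the original answer; the function name `batchwise_f1` and the example values are made up):

```python
import numpy as np

def batchwise_f1(y_true, y_pred, eps=1e-7):
    """NumPy re-implementation of the batch-wise Keras F1 metric above."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    # round(clip(x, 0, 1)) thresholds probabilities at 0.5, like K.round(K.clip(...))
    tp = np.sum(np.round(np.clip(y_true * y_pred, 0, 1)))
    possible_positives = np.sum(np.round(np.clip(y_true, 0, 1)))
    predicted_positives = np.sum(np.round(np.clip(y_pred, 0, 1)))
    recall = tp / (possible_positives + eps)
    precision = tp / (predicted_positives + eps)
    return 2 * precision * recall / (precision + recall + eps)

# Example batch: 3 of 4 true positives recovered, 1 false positive
print(batchwise_f1([1, 1, 1, 1, 0], [0.9, 0.8, 0.7, 0.2, 0.6]))  # → 0.75
```

Note that `K.epsilon()` only guards against division by zero; averaging this value over batches is what makes the metric approximate rather than an exact epoch-level F1.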

But can I use sklearn's f1_score when creating a custom metric? I want to use the average of the macro f1_score and the micro f1_score. Can anyone help me? Thanks.
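For the quantity described in the question, one straightforward option is to compute it with sklearn outside the training graph, after predicting. A minimal sketch (the averaging of macro and micro is the question's own definition, not a built-in sklearn option; `macro_micro_mean_f1` is a hypothetical helper name):

```python
from sklearn.metrics import f1_score

def macro_micro_mean_f1(y_true, y_pred):
    """Mean of the macro- and micro-averaged F1 scores."""
    macro = f1_score(y_true, y_pred, average='macro')
    micro = f1_score(y_true, y_pred, average='micro')
    return (macro + micro) / 2

# Toy 3-class example with one misclassified sample
y_true = [0, 1, 2, 2, 1, 0]
y_pred = [0, 1, 2, 1, 1, 0]
print(macro_micro_mean_f1(y_true, y_pred))
```

Because sklearn works on concrete NumPy arrays rather than symbolic tensors, this cannot be dropped directly into `metrics=[...]`; it fits naturally in a callback like the one below.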

I think you can use the code shown above during per-batch training, since it computes the F1 score for each batch; you can see it printed in the terminal log:


 1/13 [=>............................] - ETA: 4s - loss: 0.2646 - f1: 0.2927
 2/13 [===>..........................] - ETA: 4s - loss: 0.2664 - f1: 0.1463
13/13 [==============================] - 7s 505ms/step - loss: 0.2615 - f1: 0.1008 - val_loss: 0.2887 - val_f1: 0.1464


If you use the fit method and want to compute F1 per epoch, you should try code like the following.

import numpy as np
from keras.callbacks import Callback
from sklearn.metrics import f1_score

class Metrics(Callback):
    """Define your personal callback."""
    def on_train_begin(self, logs={}):
        self.val_f1s = []
        self.val_recalls = []
        self.val_precisions = []

    def on_epoch_end(self, epoch, logs={}):
        # For binary outputs you could round the probabilities instead:
        # val_predict = np.asarray(self.model.predict(self.validation_data[0])).round()
        # For multi-class output, take the argmax of the predicted probabilities
        val_predict = np.argmax(np.asarray(self.model.predict(self.validation_data[0])), axis=1)
        # ... and of the one-hot validation targets
        val_targ = np.argmax(self.validation_data[1], axis=1)
        _val_f1 = f1_score(val_targ, val_predict, average='macro')
        self.val_f1s.append(_val_f1)
        print(' — val_f1:', _val_f1)
        return
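The epoch-end computation in the callback reduces to an argmax over class probabilities followed by sklearn's macro F1. In isolation it looks like this (a sketch with made-up probabilities and targets, standing in for `self.model.predict(...)` and `self.validation_data[1]`):

```python
import numpy as np
from sklearn.metrics import f1_score

# Hypothetical softmax outputs for 4 samples, 3 classes
probs = np.array([[0.8, 0.1, 0.1],
                  [0.2, 0.7, 0.1],
                  [0.1, 0.2, 0.7],
                  [0.5, 0.4, 0.1]])
# One-hot targets, as self.validation_data[1] would hold
targets = np.array([[1, 0, 0],
                    [0, 1, 0],
                    [0, 0, 1],
                    [0, 1, 0]])

val_predict = np.argmax(probs, axis=1)   # predicted class indices: [0, 1, 2, 0]
val_targ = np.argmax(targets, axis=1)    # true class indices:      [0, 1, 2, 1]
print(f1_score(val_targ, val_predict, average='macro'))
```

Passing `average='micro'` as well and averaging the two values would give exactly the metric the question asks for.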

Fit the model with the callback:

metrics = Metrics()
model.fit_generator(generator=generator_train,
                    steps_per_epoch=len(generator_train),
                    validation_data=generator_val,
                    validation_steps=len(generator_val),
                    epochs=epochs,
                    callbacks=[metrics])

One tip to note: if you train with the fit_generator() method, you can only use the first code shown. If you train with the fit() method instead, you can also try the callback.

That's all!
