Using intermediate layers in a custom loss function for a Keras deep-learning model



I want to use a custom loss function in Keras that depends on intermediate layers of a VAE (a DNN model). I build the model with a function and then attach the loss to the model.

The error is:

ValueError: An operation has `None` for gradient. Please make sure that all of your ops have a gradient defined (i.e. are differentiable). Common ops without gradient: K.argmax, K.round, K.eval

The model compiles, but the error is raised during training.

# Below is the code which causes the error.
# load_model returns the models (en, de, model) and the
# intermediate layers (z_mean and z_log_sigma).
en, de, model, z_mean, z_log_sigma = load_model(config)

# Define the KL loss using the intermediate layers returned
# and attach it to the model as an additional loss term.
kl_loss = -0.5 * K.mean(1 + z_log_sigma - K.square(z_mean) -
                        K.exp(z_log_sigma), axis=-1)
model.add_loss(kl_loss)
model.compile(optimizer=optimizer)

# The error is raised during training.
history = model.fit_generator(
    genfun,
    steps_per_epoch=display_interval,
    epochs=1,
    shuffle=False,
    verbose=1,
)  # callbacks=[eval_map]

The solution is to design a custom loss function: wrap the loss in a closure that captures the intermediate layers, and pass it to `compile` as a regular loss.

en, de, model, z_mean, z_log_sigma = load_model(config)

def custom_loss_wrapper(z_mean=z_mean, z_log_sigma=z_log_sigma):
    def loss(y_true, y_pred):
        xent_loss = binary_crossentropy(y_true, y_pred)
        kl_loss = -0.5 * K.mean(1 + z_log_sigma - K.square(z_mean) -
                                K.exp(z_log_sigma), axis=-1)
        return xent_loss + kl_loss
    return loss

model.compile(
    optimizer=optimizer,
    loss=custom_loss_wrapper(z_mean, z_log_sigma),
)
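The wrapper works because the inner `loss` function closes over `z_mean` and `z_log_sigma`: Keras only ever calls it with `(y_true, y_pred)`, but the captured tensors stay live inside the closure and enter the computation graph. A self-contained NumPy sketch of the same pattern (the `binary_crossentropy` here is a plain NumPy stand-in, not the Keras function):

```python
import numpy as np

def binary_crossentropy(y_true, y_pred, eps=1e-7):
    # Elementwise binary cross-entropy, averaged over the last axis.
    y_pred = np.clip(y_pred, eps, 1 - eps)
    return np.mean(
        -y_true * np.log(y_pred) - (1 - y_true) * np.log(1 - y_pred),
        axis=-1,
    )

def custom_loss_wrapper(z_mean, z_log_sigma):
    # The inner function closes over z_mean / z_log_sigma, so the
    # caller can invoke it with only (y_true, y_pred).
    def loss(y_true, y_pred):
        xent_loss = binary_crossentropy(y_true, y_pred)
        kl_loss = -0.5 * np.mean(
            1 + z_log_sigma - np.square(z_mean) - np.exp(z_log_sigma),
            axis=-1,
        )
        return xent_loss + kl_loss
    return loss

# KL term is zero here because the posterior matches the prior.
loss_fn = custom_loss_wrapper(np.zeros(4), np.zeros(4))
y = np.array([1.0, 0.0, 1.0, 0.0])
print(loss_fn(y, y))  # near 0: perfect reconstruction, zero KL
```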
