Can the two different ways of adding losses in Keras/TensorFlow not be used together?



I wrote the code below to run a simple experiment with an autoencoder. I want to use two losses: the first is the conventional MSE loss between the input and the output reconstructed from the AE's latent vector; the second is an MSE loss between the outputs of the symmetric layers in the encoder and decoder. That is, if the AE has 5 layers, I want to add an MSE loss between the 2nd and 4th layers, since they are symmetric. The code is here:

from time import time
import numpy as np
import random
import keras.backend as K
from keras.engine.topology import Layer, InputSpec
from keras.layers import Dense, Input, GaussianNoise, Activation
from keras.models import Model
from keras.optimizers import SGD, Adam
from keras.utils.vis_utils import plot_model
from keras.callbacks import EarlyStopping

# build the autoencoder model
input_place = Input(shape=(128,))
e_layer1 = Dense(64, activation='relu')(input_place)
e_layer2 = Dense(32, activation='relu')(e_layer1)
hidden = Dense(16, activation='relu')(e_layer2)
d_layer1 = Dense(32, activation='relu')(hidden)
d_layer2 = Dense(64, activation='relu')(d_layer1)
output_place = Dense(128, activation='sigmoid')(d_layer2)
model = Model(inputs=input_place, outputs=output_place)

# second loss: MSE between the symmetric layers e_layer2 and d_layer1
loss = K.mean(K.square(d_layer1 - e_layer2), axis=-1)
model.add_loss(loss)
model.compile(optimizer='adam',
              loss=['mse'],
              metrics=['accuracy'])
input_data = np.random.randn(400, 128)
model.fit(input_data,
          input_data,
          batch_size=32,
          epochs=5)

But when I run this code, it raises the following error:

Epoch 1/5
32/400 [=>............................] - ETA: 12s - loss: 1.6429 - acc: 0.0000e+00Traceback (most recent call last):
File "<ipython-input-49-eac3a65824ec>", line 1, in <module>
runfile('/Users/jishilun/Desktop/keras_loss_test.py', wdir='/Users/jishilun/Desktop')
File "/anaconda3/lib/python3.7/site-packages/spyder_kernels/customize/spydercustomize.py", line 704, in runfile
execfile(filename, namespace)
File "/anaconda3/lib/python3.7/site-packages/spyder_kernels/customize/spydercustomize.py", line 108, in execfile
exec(compile(f.read(), filename, 'exec'), namespace)
File "/Users/jishilun/Desktop/keras_loss_test.py", line 49, in <module>
epochs=5)
File "/anaconda3/lib/python3.7/site-packages/keras/engine/training.py", line 1039, in fit
validation_steps=validation_steps)
File "/anaconda3/lib/python3.7/site-packages/keras/engine/training_arrays.py", line 204, in fit_loop
callbacks.on_batch_end(batch_index, batch_logs)
File "/anaconda3/lib/python3.7/site-packages/keras/callbacks.py", line 115, in on_batch_end
callback.on_batch_end(batch, logs)
File "/anaconda3/lib/python3.7/site-packages/keras/callbacks.py", line 236, in on_batch_end
self.totals[k] += v * batch_size
ValueError: operands could not be broadcast together with shapes (32,) (16,) (32,) 

If I remove add_loss, the code runs. So I think the two ways of adding losses in Keras/TensorFlow cannot simply be used together, or perhaps something needs to be changed (maybe the problem happens with the mini-batches?). Please help me! Any comments or suggestions are welcome! Thanks!

The problem is not add_loss(), but your batch_size. Your input data has shape (400, 128) while batch_size is 32; since 400 is not divisible by 32, the last batch contains only 16 samples, and the per-sample loss tensor you pass to add_loss can no longer be accumulated with the totals from the full 32-sample batches, which is exactly the `(32,) (16,)` broadcast error in the traceback. Try changing batch_size to a factor of 400, e.g. 40 or 20, and it will work.
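The broadcast failure can be reproduced with plain NumPy, independent of Keras. The sketch below mimics what Keras's batch-end logging callback does (`totals[k] += v * batch_size`, as seen in the traceback) when the logged "loss" is a per-sample vector instead of a scalar; the variable names here are illustrative, not Keras internals:

```python
import numpy as np

# 400 samples with batch_size=32: the final batch holds the remainder.
n_samples, batch_size = 400, 32
last_batch = n_samples % batch_size  # 16

# Because add_loss() was given K.mean(..., axis=-1), the logged loss value
# is a vector whose length follows the batch size. The running total built
# from full batches (length 32) then cannot absorb the short final batch:
totals = np.zeros(batch_size)
v_full = np.random.randn(batch_size)   # loss vector from a full batch
v_last = np.random.randn(last_batch)   # loss vector from the final, short batch
totals += v_full * batch_size          # fine: (32,) + (32,)
try:
    totals += v_last * last_batch      # (32,) + (16,) -> ValueError
except ValueError as e:
    print("broadcast error:", e)

# Two ways out: pick a batch_size that divides 400 (e.g. 40 or 20), or
# reduce the extra loss to a scalar before registering it, e.g.
#   model.add_loss(K.mean(K.square(d_layer1 - e_layer2)))  # mean over all axes
# so the logged value no longer depends on the batch size.
```

Reducing the symmetric-layer loss to a scalar is the more robust fix, since it works for any batch size.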
