I'm trying to recreate Keras's BatchNormalization layer in NumPy (Python):

import numpy as np
from keras.models import Sequential
from keras.layers import BatchNormalization

model = Sequential()
model.add(BatchNormalization(axis=1, center=False, scale=False))
model.compile(optimizer='adam', loss='mse', metrics=['mse'])
scale = np.linspace(0, 100, 1000)
x_train = np.sin(scale) + 2.5
y_train = np.sin(scale)
print(x_train.shape)
print(y_train.shape)
model.fit(x_train, y_train, epochs=100, batch_size=100, shuffle=True, verbose=2)
x_test = np.array([1])
# with center=False and scale=False, the layer's weights are the moving statistics
mean = model.layers[0].get_weights()[0]
var = model.layers[0].get_weights()[1]
print('mean', np.mean(x_train), 'mean_tf', mean)
print('var', np.var(x_train), 'var_tf', var)
print('result_tf', model.predict(x_test))
print('result_pred', (x_test - mean) / var)
Why am I not getting the same result?

It works identically when center=True and scale=True, but I want to keep it simple. I already have other layers, such as Dense and LSTM, working.
Try print('result_pred', (x_test - mean) / np.sqrt(var)) — the layer stores the variance, so you must take its square root before dividing. For further explanation, check my edit in this answer: stackoverflow.com/a/65744394/10733051
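As a minimal sketch of the inference-time math the layer applies (assuming Keras's default `epsilon=1e-3`, which is added to the variance for numerical stability and accounts for a small residual difference even after taking the square root):

```python
import numpy as np

def batch_norm_inference(x, moving_mean, moving_var, epsilon=1e-3):
    # With center=False and scale=False there is no learned beta/gamma;
    # inference just normalizes by the moving statistics.
    return (x - moving_mean) / np.sqrt(moving_var + epsilon)

# Toy check: normalize a single value with hypothetical moving statistics.
x_test = np.array([1.0])
print(batch_norm_inference(x_test, moving_mean=2.5, moving_var=0.5))
```

The function names and the sample statistics here are illustrative; in the question's setup you would pass in the `mean` and `var` read back via `get_weights()`.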