Recurrent Neural Networks



Can someone explain what the layer_size hyperparameter does in this recurrent neural network model?

### RNN model: testing a binary classification model
import keras
from keras.models import Sequential
from keras.layers import LSTM, Dropout, Dense, Activation
from sklearn.metrics import accuracy_score, precision_score, recall_score

batch_size = 32
epochs = 10
layer_size = 256
drop_out = 0.001

model = Sequential()
model.add(LSTM(layer_size, input_shape=(30, 1), return_sequences=True))
model.add(Dropout(drop_out))
model.add(LSTM(layer_size * 2, return_sequences=True))
model.add(Dropout(drop_out))
model.add(LSTM(layer_size, return_sequences=False))
model.add(Dropout(drop_out))
model.add(Dense(1))
model.add(Activation('sigmoid'))
model.compile(loss='binary_crossentropy', optimizer=keras.optimizers.Adam(), metrics=['accuracy'])
model.summary()

fit = model.fit(X_train_kb2_keras, y_train2_kb2_keras, batch_size=batch_size, epochs=epochs, validation_split=0.20)

# predict_classes was removed in recent Keras; threshold the sigmoid output instead
y_pred = (model.predict(X_test_keras) > 0.5).astype('int32')
print("Accuracy", accuracy_score(y_test2_kb2_keras, y_pred))
print("precision_score", precision_score(y_test2_kb2_keras, y_pred))
print("recall", recall_score(y_test2_kb2_keras, y_pred))

According to the documentation, it is the dimensionality of the output: https://keras.io/api/layers/recurrent_layers/lstm/ (layer_size corresponds to units in the docs).

So the output of that LSTM layer most likely contains 256 elements per timestep, I believe. One way to verify this is to call the layer on some input and inspect the shape of the result:

import tensorflow as tf
from tensorflow.keras.layers import LSTM

temp = LSTM(layer_size, input_shape=(30, 1), return_sequences=True)
# a Layer object has no shape of its own; it must be called on an input first
out = temp(tf.zeros((1, 30, 1)))
print(out.shape)  # (1, 30, 256)
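Extending that check to the full stacked architecture from the question shows how layer_size flows through each layer (a minimal sketch: the Dropout layers are omitted since they don't change shapes, and the zero tensor is just placeholder input):

```python
import tensorflow as tf
from tensorflow.keras.layers import LSTM, Dense

layer_size = 256
x = tf.zeros((4, 30, 1))  # (batch, timesteps, features), dummy data

h1 = LSTM(layer_size, return_sequences=True)(x)       # (4, 30, 256)
h2 = LSTM(layer_size * 2, return_sequences=True)(h1)  # (4, 30, 512)
h3 = LSTM(layer_size, return_sequences=False)(h2)     # (4, 256): last timestep only
out = Dense(1, activation='sigmoid')(h3)              # (4, 1)

print(h1.shape, h2.shape, h3.shape, out.shape)
```

In every case layer_size (units) sets the size of the last output dimension, i.e. the width of each LSTM's hidden state.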
