Keras fails when processing word embeddings


word_embed = keras.layers.Embedding(len(vocab), 101)
em = word_embed(tf.convert_to_tensor(np.random.rand(10, 48)))
print(em.shape)
# (10, 48, 101)
# 10 sentences in the batch, 48 words per sentence (with padding), 101 = embedding dimension
lstm = keras.layers.LSTMCell(101)
lstm_layer = keras.layers.RNN(lstm)
hidden_states = lstm_layer(em)
# TypeError: Expected Operation, Variable, or Tensor, got False

Can anyone help explain why this error occurs? I expected to get the hidden states of the LSTM cell...
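One likely culprit is the input to `Embedding`: `np.random.rand(10, 48)` produces floats, but an `Embedding` layer looks up rows by integer token indices. A minimal sketch of the standalone `Embedding` → `RNN(LSTMCell)` pipeline with integer inputs (here `vocab_size = 500` is a hypothetical stand-in for `len(vocab)`):

```python
import numpy as np
import tensorflow as tf
from tensorflow import keras

vocab_size = 500  # hypothetical; use len(vocab) in your code

word_embed = keras.layers.Embedding(vocab_size, 101)

# Embedding expects integer token indices, not floats
tokens = tf.constant(np.random.randint(0, vocab_size, size=(10, 48)))
em = word_embed(tokens)  # shape (10, 48, 101)

# Wrap the cell in an RNN layer; by default it returns the final hidden state
lstm_layer = keras.layers.RNN(keras.layers.LSTMCell(101))
hidden_states = lstm_layer(em)  # shape (10, 101)
print(hidden_states.shape)
```

If you want the hidden state at every timestep rather than only the last one, pass `return_sequences=True` to `keras.layers.RNN` (giving shape `(10, 48, 101)`).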

You can feed the word embeddings into an LSTM layer inside a model as shown below.

Here I'm using a binary-classification example with embedding dimension 101 and a maximum of 48 words per sentence.

import numpy as np
import tensorflow as tf
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Embedding, Dense, LSTM

model = Sequential()
model.add(Embedding(len(vocab), 101, input_length=48))   # vocab from your tokenizer
model.add(LSTM(100, dropout=0.2, recurrent_dropout=0.2))
model.add(Dense(1, activation='sigmoid'))
model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])
## Fit the model (data: padded integer sequences of shape (n, 48); labels: 0/1)
model.fit(data, np.array(labels), validation_split=0.4, epochs=3)
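The snippet above assumes `vocab`, `data`, and `labels` already exist. A hedged sketch of how they might be produced from raw text with `tf.keras.layers.TextVectorization` (the toy corpus and labels here are made up for illustration):

```python
import tensorflow as tf

texts = ["good movie", "truly awful film"]  # hypothetical toy corpus
labels = [1, 0]                             # hypothetical binary labels

# Map raw strings to padded integer sequences of length 48
vectorize = tf.keras.layers.TextVectorization(output_sequence_length=48)
vectorize.adapt(texts)

data = vectorize(tf.constant(texts)).numpy()  # shape (2, 48), integer token ids
vocab = vectorize.get_vocabulary()            # use len(vocab) for the Embedding layer
print(data.shape)
```

This yields exactly the kind of input `Embedding(len(vocab), 101, input_length=48)` expects: a batch of integer sequences padded to length 48.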
