How to apply Conv1D after a Dense or Flatten layer: ValueError: Shapes (1,1,3) and (1,1) are incompatible



How can I apply a Conv1D layer after a Dense or Flatten layer?

The error raised is:

ValueError: Shapes (1,1,3) and (1,1) are incompatible.

The dataset is not a time series, so please do not suggest changing the order of the layers. The input data has 1000 rows with 50 features each. The output y is multiclass with labels [0, 1, 2].

Here is sample code:
from keras.layers import Flatten
from keras.layers.convolutional import Conv1D
from keras.layers.convolutional import MaxPooling1D
from keras.models import Sequential
from keras.layers import Dense
from keras.layers import Dropout
from keras.utils import to_categorical
import numpy as np
from sklearn.model_selection import train_test_split
import tensorflow as tf
import time  # needed for time.time() below
tf.get_logger().setLevel('ERROR')
verbose, epochs, batch_size = 0, 10, 1
x=np.random.randint(-10,10,(1000,50,1)).astype(float)
y=np.random.randint(0,3,(1000,1,1))
train_x, test_x, train_y, test_y = train_test_split(x, y, test_size=0.15, random_state=17)
train_y = to_categorical(train_y)
test_y = to_categorical(test_y)
n_features, n_outputs = train_x.shape[1], train_y.shape[1]
          
model = Sequential()
model.add(Dense(n_features, activation= 'relu'))
model.add(Conv1D(filters=64, kernel_size=3, activation='relu'))
model.add(Dropout(0.5))
model.add(MaxPooling1D(pool_size=2))
model.add(Flatten())
model.add(Dense(10, activation= 'relu'))
model.add(Dropout(0.2))
model.add(Dense(5, activation= 'relu'))
model.add(Dropout(0.2))
model.add(Dense(n_outputs, activation='softmax'))
t=time.time()
model.compile(loss='categorical_crossentropy', optimizer='adam', metrics=['accuracy'])
# the ValueError occurs here: with batch_size=1 the target batch has shape (1, 1, 3)
# while the model output has shape (1, 1)
history=model.fit(train_x, train_y, epochs=epochs, batch_size=batch_size, verbose=verbose)
_, accuracy = model.evaluate(test_x, test_y, batch_size=batch_size, verbose=verbose)

print(accuracy)

I debugged your problem: the issue is not the Conv1D applied after the Dense layer in the code you posted, but your last layer.

For multiclass classification, your output layer should have as many units as there are classes, which in your case is 3.

Your targets train_y and test_y have a 3D shape instead of the expected 2D shape, i.e. (batch_size, num_classes).
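
To make the shapes concrete, here is a minimal sketch (assuming 3 classes, as in the question) of what to_categorical produces for y of shape (1000, 1, 1) and how the reshape fixes it:

import numpy as np
from keras.utils import to_categorical

y = np.random.randint(0, 3, (1000, 1, 1))                # integer labels, shape (1000, 1, 1)
y_cat = to_categorical(y)                                # one-hot, shape (1000, 1, 3) -- still 3D
y_cat = y_cat.reshape((y_cat.shape[0], y_cat.shape[2]))  # shape (1000, 3), i.e. (batch_size, num_classes)
print(y_cat.shape)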

So once you reshape train_y and test_y and change the number of units in the last layer, it will work. For convenience, I have pasted the full code below; I checked it and it runs.

from keras.layers import Flatten
from keras.layers.convolutional import Conv1D
from keras.layers.convolutional import MaxPooling1D
from keras.models import Sequential
from keras.layers import Dense
from keras.layers import Dropout
from keras.utils import to_categorical
import numpy as np
from sklearn.model_selection import train_test_split
import tensorflow as tf
import time  # needed for time.time() below
tf.get_logger().setLevel('ERROR')
verbose, epochs, batch_size = 0, 10, 1
x=np.random.randint(-10,10,(1000,50,1)).astype(float)
y=np.random.randint(0,3,(1000,1,1))
train_x, test_x, train_y, test_y = train_test_split(x, y, test_size=0.15, random_state=17)
train_y = to_categorical(train_y)
test_y = to_categorical(test_y)
n_features, n_outputs = train_x.shape[1], train_y.shape[1]
          
# collapse the singleton axis: (samples, 1, 3) -> (samples, 3), i.e. (batch_size, num_classes)
train_y = train_y.reshape((train_y.shape[0], train_y.shape[2]))
print(train_y.shape)
test_y = test_y.reshape((test_y.shape[0], test_y.shape[2]))
print(test_y.shape)
model = Sequential()
model.add(Dense(n_features, activation= 'relu'))
model.add(Conv1D(filters=64, kernel_size=3, activation='relu'))
model.add(Dropout(0.5))
model.add(MaxPooling1D(pool_size=2))
model.add(Flatten())
model.add(Dense(10, activation= 'relu'))
model.add(Dropout(0.2))
model.add(Dense(5, activation= 'relu'))
model.add(Dropout(0.2))
model.add(Dense(3, activation='softmax'))  # 3 output units: one per class

t=time.time()
model.compile(loss='categorical_crossentropy', optimizer='adam', metrics=['accuracy'])
history=model.fit(train_x, train_y, epochs=epochs, batch_size=batch_size, verbose=verbose)
_, accuracy = model.evaluate(test_x, test_y, batch_size=batch_size, verbose=verbose)

print(accuracy)
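
As a side note (not part of the original answer), the same model can also be trained with sparse_categorical_crossentropy on the raw integer labels, which avoids to_categorical and the reshape entirely. A minimal sketch under that assumption:

import numpy as np
from keras.models import Sequential
from keras.layers import Dense, Conv1D, MaxPooling1D, Flatten

x = np.random.randint(-10, 10, (1000, 50, 1)).astype(float)
y = np.random.randint(0, 3, (1000,))          # integer class ids, shape (samples,)

model = Sequential()
model.add(Dense(50, activation='relu', input_shape=(50, 1)))
model.add(Conv1D(filters=64, kernel_size=3, activation='relu'))
model.add(MaxPooling1D(pool_size=2))
model.add(Flatten())
model.add(Dense(3, activation='softmax'))     # one unit per class
model.compile(loss='sparse_categorical_crossentropy', optimizer='adam', metrics=['accuracy'])
model.fit(x, y, epochs=1, batch_size=32, verbose=0)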
