I am trying to add a skip connection after two CNN layers, before the max pooling, but I get an error. Below is my sample code:
from tensorflow.keras.layers import Input, Conv2D, Add, MaxPooling2D
from tensorflow.keras.models import Model

X = Input(shape=(256, 256, 3))
X_shortcut = X
layer_in = Conv2D(64,(3, 3), padding='same', activation='relu')(X_shortcut)
X = Add()([X, X_shortcut])
layer_in = Conv2D(64, (3, 3), padding='same', activation='relu')(layer_in)
X = Add()([layer_in, X_shortcut])  # <-- this line raises the ValueError: 64 channels vs. 3 channels
layer_in = MaxPooling2D((2, 2), strides=(2, 2))(layer_in)
model = Model(inputs=X_shortcut, outputs=layer_in)
# summarize model
model.summary()
This raises the following ValueError:
ValueError: Operands could not be broadcast together with shapes (256, 256, 64) (256, 256, 3)
layer_in has 64 channels and X_shortcut has 3 channels, so adding them element-wise is not possible. You can concatenate them instead; the output shape will then be (256, 256, 67).
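A minimal sketch of the concatenation variant, assuming the same tensorflow.keras layers used above (Concatenate joins the tensors along the channel axis):

from tensorflow.keras.layers import Input, Conv2D, Concatenate, MaxPooling2D
from tensorflow.keras.models import Model

X_shortcut = Input(shape=(256, 256, 3))
layer_in = Conv2D(64, (3, 3), padding='same', activation='relu')(X_shortcut)
layer_in = Conv2D(64, (3, 3), padding='same', activation='relu')(layer_in)
# concatenate along the channel axis: (256, 256, 64) and (256, 256, 3) -> (256, 256, 67)
merged = Concatenate()([layer_in, X_shortcut])
layer_in = MaxPooling2D((2, 2), strides=(2, 2))(merged)
model = Model(inputs=X_shortcut, outputs=layer_in)
model.summary()

If you specifically want an element-wise Add (the ResNet-style skip), a common alternative is to first project X_shortcut to 64 channels with a 1x1 convolution, e.g. Conv2D(64, (1, 1))(X_shortcut), and add that result to layer_in instead.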