Why does the loss function return a negative value?



I built a CNN model with Keras for an intrusion detection system, and I have a problem with my loss results. How can I fix my code, please? These are the shapes of the dataset:

x_train shape: (1131151, 79)
y_train shape: (1131151, 2)
x_test shape: (53386, 79)
y_test shape: (53386, 2)
train shape after reshape: (1131151, 79, 1)
test shape after reshape: (53386, 79, 1)
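The reshape step that produces the `(samples, 79, 1)` shapes above can be sketched as follows. This is a minimal illustration assuming NumPy arrays; the small array sizes here are placeholders, not the real dataset dimensions:

```python
import numpy as np

# Placeholder arrays standing in for the real data
# (the actual x_train has shape (1131151, 79)).
x_train = np.zeros((1000, 79))
x_test = np.zeros((50, 79))

# Conv1D expects a trailing channel axis, so add a dimension of size 1.
x_train = np.expand_dims(x_train, axis=-1)  # -> (1000, 79, 1)
x_test = np.expand_dims(x_test, axis=-1)    # -> (50, 79, 1)

print(x_train.shape, x_test.shape)
```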
My model is:
model = Sequential()
#convolution 1st layer
model.add(Conv1D(32, kernel_size=(filter_size), padding="same",
                 activation='relu',
                 input_shape=(X_train.shape[1], 1)))
model.add(BatchNormalization())
model.add(Dropout(droprate))

#convolution 2nd layer
model.add(Conv1D(64, kernel_size=(filter_size), activation='relu', padding="same"))
model.add(BatchNormalization())
model.add(MaxPooling1D(strides=1))
model.add(Dropout(droprate))
#convolution 3rd layer
model.add(Conv1D(128, kernel_size=(filter_size), activation='relu', padding="same"))
model.add(BatchNormalization())
model.add(MaxPooling1D(strides=1))
model.add(Dropout(droprate))
#FCN 1st layer
model.add(Flatten())
model.add(Dense(128,use_bias=False))
model.add(BatchNormalization())
model.add(Activation('relu'))
model.add(Dropout(droprate))
#FCN 2nd layer
model.add(Dense(32,use_bias=False))
model.add(BatchNormalization())
model.add(Activation('relu'))
model.add(Dropout(droprate))
#FCN 3rd layer
model.add(Dense(16,use_bias=False))
model.add(BatchNormalization())
model.add(Activation('relu'))
model.add(Dropout(droprate))
#FCN final layer
model.add(Dense(2))
model.add(Activation('sigmoid'))
model.compile(loss="binary_crossentropy", optimizer="Adam", metrics=['accuracy'])
model.summary()

The results are:

score = model.evaluate(X_test, preprocess.y_test, verbose=0)
Test loss: 9.198558109346777e-05
Test accuracy: 0.9999812841415405

As you can see, the loss function is negative and the accuracy is close to 100%. What is wrong with my code?

Test loss: 9.198558109346777e-05 — that value is not negative. It is written in scientific notation: 9.198558109346777 × 10⁻⁵, i.e. about 0.00009198558, which is a very small positive number.
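You can confirm this directly in Python — e-notation is just a compact way of printing small floats, and the value is positive:

```python
loss = 9.198558109346777e-05

# The value is positive, not negative.
print(loss > 0)        # True

# Printed in plain decimal form it reads 0.00009198558...
print(f"{loss:.11f}")  # 0.00009198558
```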
