Problem with a two-output neural network using Keras



I'm using Keras with Python, and I've run into a problem: when I run the code below, I usually get one of two accuracy results, either 10% or 90%.

import numpy as np
from numpy import loadtxt
from keras.models import Sequential
from keras.layers import Dense
ler = loadtxt(r'C:\Users\Mateus\Desktop\Nova\artigo.csv')
ler_norm = ler / np.sqrt(np.sum(ler**2))
entrada = ler_norm[:,0:3]
saida = ler[:,3:5]
model = Sequential()
model.add(Dense(units = 3, input_dim = 3, activation='relu'))
model.add(Dense(units = 2, activation = 'sigmoid'))
model.compile(loss='mean_absolute_error', optimizer='adam', metrics=['accuracy'])
model.fit(entrada, saida, epochs=100, batch_size=10)
_, accuracy = model.evaluate(entrada, saida)
print('Accuracy: {:.2f}%'.format(accuracy*100))

"entrada" e "saida" 中使用的部分值(原始数据库有 300x5(:

68|541|257|72.9|84.0
102|576|322|73.6|84.8
54|528|315|73.6|84.0
99|435|357|73.7|84.0
95|454|115|73.1|83.5
91|300|140|73.5|82.5
118|362|144|73.6|85.0
118|450|233|73.4|83.5
93|378|121|73.7|84.0
95|403|117|73.3|84.0
131|349|80|73.4|85.0
112|467|257|74.0|83.5
50|463|134|73.2|83.5
97|374|159|73.3|85.0

The last 16 epochs:

Epoch 85/100
300/300 [==============================] - 0s 177us/step - loss: 77.1145 - acc: 0.4867
Epoch 86/100
300/300 [==============================] - 0s 167us/step - loss: 77.1126 - acc: 0.5400
Epoch 87/100
300/300 [==============================] - 0s 157us/step - loss: 77.1108 - acc: 0.5600
Epoch 88/100
300/300 [==============================] - 0s 159us/step - loss: 77.1091 - acc: 0.6200
Epoch 89/100
300/300 [==============================] - 0s 167us/step - loss: 77.1073 - acc: 0.6733
Epoch 90/100
300/300 [==============================] - 0s 171us/step - loss: 77.1057 - acc: 0.5333
Epoch 91/100
300/300 [==============================] - 0s 157us/step - loss: 77.1040 - acc: 0.4600
Epoch 92/100
300/300 [==============================] - 0s 164us/step - loss: 77.1024 - acc: 0.5333
Epoch 93/100
300/300 [==============================] - 0s 176us/step - loss: 77.1008 - acc: 0.4800
Epoch 94/100
300/300 [==============================] - 0s 160us/step - loss: 77.0992 - acc: 0.5400
Epoch 95/100
300/300 [==============================] - 0s 150us/step - loss: 77.0977 - acc: 0.6067
Epoch 96/100
300/300 [==============================] - 0s 166us/step - loss: 77.0962 - acc: 0.5133
Epoch 97/100
300/300 [==============================] - 0s 168us/step - loss: 77.0947 - acc: 0.5400
Epoch 98/100
300/300 [==============================] - 0s 150us/step - loss: 77.0933 - acc: 0.4067
Epoch 99/100
300/300 [==============================] - 0s 164us/step - loss: 77.0919 - acc: 0.5267
Epoch 100/100
300/300 [==============================] - 0s 166us/step - loss: 77.0905 - acc: 0.5067

Does anyone know what's going wrong? Thanks for reading.

You are trying to predict continuous values of around 80 with a sigmoid activation, which can only produce outputs between 0 and 1. Try a linear or relu activation instead:

model.add(Dense(units = 2, activation = 'linear'))
model.add(Dense(units = 2, activation = 'relu'))

Also, accuracy is meaningless in a regression problem; use a metric such as mae instead.

model.compile(loss='mean_absolute_error', optimizer='adam', metrics=['mae'])
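Putting the two fixes together, a minimal sketch of a two-output regression model might look like this. The data here is synthetic (random inputs and targets in roughly the 75–85 range, standing in for your CSV), so the shapes and value ranges are assumptions:

```python
import numpy as np
from keras.models import Sequential
from keras.layers import Dense, Input

# Synthetic stand-in data: 20 samples, 3 input features,
# 2 continuous targets around 75-85 (like the question's columns 4-5)
rng = np.random.default_rng(0)
X = rng.random((20, 3))
y = 75.0 + 10.0 * rng.random((20, 2))

model = Sequential()
model.add(Input(shape=(3,)))
model.add(Dense(units=3, activation='relu'))
# Linear output layer: unbounded, suitable for regression targets ~80
model.add(Dense(units=2, activation='linear'))
# mae instead of accuracy, since this is regression
model.compile(loss='mean_absolute_error', optimizer='adam', metrics=['mae'])
model.fit(X, y, epochs=5, batch_size=5, verbose=0)

preds = model.predict(X, verbose=0)
print(preds.shape)  # (20, 2): one row per sample, two continuous outputs
```

With the linear output layer the predictions are no longer squashed into [0, 1], so the model can actually reach the target range instead of saturating at 1 and reporting a huge, barely moving loss like the 77.x seen in the training log.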
