Testing a Keras Sequential neural network - predictions are completely inaccurate



This is my first time using Keras and TensorFlow in Python, and I want to use them to build a computer player for a card game. I have the following test code to prove to myself that I understand how to set up a basic neural network, but the predictions are not what I expect: they bear no resemblance to the outcomes in the input data. Sometimes the predictions are all 1s, sometimes all 0s.

My test code:

from keras.models import Sequential
from keras.layers import Dense
# Define the keras model
model = Sequential()
model.add(Dense(6, input_dim=3, activation='relu'))
model.add(Dense(3, activation='relu'))
model.add(Dense(1, activation='sigmoid'))
model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])
trainingData = [[10,1900,1,1], # Features, Choice, Outcomes
[20,1800,1,1], 
[90,1000,1,0],
[80,1100,1,0],
[10,1900,0,0],
[20,1800,0,0],
[90,1000,0,1],
[80,1100,0,1],
]
# Split training data into features and outcomes
features = []
outcomes = []
for trainingDataRow in trainingData:
    features.append(trainingDataRow[:-1])
    outcomes.append(trainingDataRow[-1])

# fit the keras model on the dataset
model.fit(features, outcomes, epochs=15, batch_size=8)
# Evaluate the keras model accuracy
_, accuracy = model.evaluate(features, outcomes)
print('Accuracy: %.2f' % (accuracy*100))
# Test model predictions against training data
predictions = model.predict_classes(features)
for i in range(0, len(features)):
    print('%s => %d (expected %d)' % (features[i], predictions[i], outcomes[i]))
# Print the model summary
print(model.summary())

Output:

Epoch 1/15
1/1 - ETA: 0s - loss: 265.5195 - accuracy: 0.5000
1/1 -  15s 15s/step - loss: 265.5195 - accuracy: 0.5000
Epoch 2/15
1/1 - ETA: 0s - loss: 263.9588 - accuracy: 0.5000
1/1 - 0s 16ms/step - loss: 263.9588 - accuracy: 0.5000
Epoch 3/15
1/1 - ETA: 0s - loss: 262.4041 - accuracy: 0.5000
1/1 - 0s 16ms/step - loss: 262.4041 - accuracy: 0.5000
Epoch 4/15
[etc]
1/1 - ETA: 0s - loss: 248.6939 - accuracy: 0.5000
1/1 - 0s 16ms/step - loss: 248.6939 - accuracy: 0.5000
Epoch 13/15
1/1 - ETA: 0s - loss: 247.2033 - accuracy: 0.5000
1/1 - 0s 16ms/step - loss: 247.2033 - accuracy: 0.5000
Epoch 14/15
1/1 - ETA: 0s - loss: 245.7196 - accuracy: 0.5000
1/1 - 0s 16ms/step - loss: 245.7196 - accuracy: 0.5000
Epoch 15/15
1/1 - ETA: 0s - loss: 244.2428 - accuracy: 0.5000
1/1 - 0s 16ms/step - loss: 244.2428 - accuracy: 0.5000
1/1 - ETA: 0s - loss: 242.7730 - accuracy: 0.5000
1/1 - 0s 445ms/step - loss: 242.7730 - accuracy: 0.5000
Accuracy: 50.00
[20, 1800, 1] => 1 (expected 1)
[90, 1000, 1] => 1 (expected 0)
[80, 1100, 1] => 1 (expected 0)
[10, 1900, 0] => 1 (expected 0)
[20, 1800, 0] => 1 (expected 0)
[90, 1000, 0] => 1 (expected 1)
[80, 1100, 0] => 1 (expected 1)
Model: "sequential"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
dense (Dense)                (None, 6)                 24        
_________________________________________________________________
dense_1 (Dense)              (None, 3)                 21        
_________________________________________________________________
dense_2 (Dense)              (None, 1)                 4         
=================================================================
Total params: 49
Trainable params: 49
Non-trainable params: 0
_________________________________________________________________
None

Thanks for all the help and suggestions.

Based on the conversation below, I have updated the code with more input data, more epochs, and some other changes. Even so, the accuracy still varies widely, even over the last 10 epochs, anywhere between 56% and 83%. Any suggestions on how I should improve this?

Code:

from keras.models import Sequential
from keras.layers import Dense
# Define the keras model
model = Sequential()
model.add(Dense(12, input_dim=3, activation='relu'))
model.add(Dense(6, activation='relu'))
model.add(Dense(1, activation='sigmoid'))
model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])
trainingData = [[10,1000,0,0],
[11,1001,0,0],
[12,1002,0,0],
[13,1003,0,0],
[14,1004,0,0],
[15,1005,0,0],
[16,1006,0,0],
[17,1007,0,0],
[18,1008,0,0],
[19,1009,0,0],
[20,1010,0,0],
[21,1011,0,0],
[22,1012,0,0],
[23,1013,0,0],
[24,1014,0,0],
[25,1015,0,0],
[26,1016,0,0],
[27,1017,0,0],
[28,1018,0,0],
[29,1019,0,0],
[30,1020,0,0],
[10,1000,1,0],
[11,1001,1,0],
[12,1002,1,0],
[13,1003,1,0],
[14,1004,1,0],
[15,1005,1,0],
[16,1006,1,0],
[17,1007,1,0],
[18,1008,1,0],
[19,1009,1,0],
[20,1010,1,0],
[21,1011,1,0],
[22,1012,1,0],
[23,1013,1,0],
[24,1014,1,0],
[25,1015,1,0],
[26,1016,1,0],
[27,1017,1,0],
[28,1018,1,0],
[29,1019,1,0],
[30,1020,1,0],
[10,9000,0,0],
[11,9001,0,0],
[12,9002,0,0],
[13,9003,0,0],
[14,9004,0,0],
[15,9005,0,0],
[16,9006,0,0],
[17,9007,0,0],
[18,9008,0,0],
[19,9009,0,0],
[20,9010,0,0],
[21,9011,0,0],
[22,9012,0,0],
[23,9013,0,0],
[24,9014,0,0],
[25,9015,0,0],
[26,9016,0,0],
[27,9017,0,0],
[28,9018,0,0],
[29,9019,0,0],
[30,9020,0,0],
[10,9021,1,1],
[11,9022,1,1],
[12,9023,1,1],
[13,9024,1,1],
[14,9025,1,1],
[15,9026,1,1],
[16,9027,1,1],
[17,9028,1,1],
[18,9029,1,1],
[19,9030,1,1],
[20,9031,1,1],
[21,9032,1,1],
[22,9033,1,1],
[23,9034,1,1],
[24,9035,1,1],
[25,9036,1,1],
[26,9037,1,1],
[27,9038,1,1],
[28,9039,1,1],
[29,9040,1,1],
[30,9041,1,1],
[70,1000,0,1],
[71,1001,0,1],
[72,1002,0,1],
[73,1003,0,1],
[74,1004,0,1],
[75,1005,0,1],
[76,1006,0,1],
[77,1007,0,1],
[78,1008,0,1],
[79,1009,0,1],
[80,1010,0,1],
[81,1011,0,1],
[82,1012,0,1],
[83,1013,0,1],
[84,1014,0,1],
[85,1015,0,1],
[86,1016,0,1],
[87,1017,0,1],
[88,1018,0,1],
[89,1019,0,1],
[90,1020,0,1],
[70,1000,1,0],
[71,1001,1,0],
[72,1002,1,0],
[73,1003,1,0],
[74,1004,1,0],
[75,1005,1,0],
[76,1006,1,0],
[77,1007,1,0],
[78,1008,1,0],
[79,1009,1,0],
[80,1010,1,0],
[81,1011,1,0],
[82,1012,1,0],
[83,1013,1,0],
[84,1014,1,0],
[85,1015,1,0],
[86,1016,1,0],
[87,1017,1,0],
[88,1018,1,0],
[89,1019,1,0],
[90,1020,1,0],
[70,9000,0,1],
[71,9001,0,1],
[72,9002,0,1],
[73,9003,0,1],
[74,9004,0,1],
[75,9005,0,1],
[76,9006,0,1],
[77,9007,0,1],
[78,9008,0,1],
[79,9009,0,1],
[80,9010,0,1],
[81,9011,0,1],
[82,9012,0,1],
[83,9013,0,1],
[84,9014,0,1],
[85,9015,0,1],
[86,9016,0,1],
[87,9017,0,1],
[88,9018,0,1],
[89,9019,0,1],
[90,9020,0,1],
[70,9000,1,1],
[71,9001,1,1],
[72,9002,1,1],
[73,9003,1,1],
[74,9004,1,1],
[75,9005,1,1],
[76,9006,1,1],
[77,9007,1,1],
[78,9008,1,1],
[79,9009,1,1],
[80,9010,1,1],
[81,9011,1,1],
[82,9012,1,1],
[83,9013,1,1],
[84,9014,1,1],
[85,9015,1,1],
[86,9016,1,1],
[87,9017,1,1],
[88,9018,1,1],
[89,9019,1,1],
[90,9020,1,1]]
# Split training data into features and outcomes
features = []
outcomes = []
for trainingDataRow in trainingData:
    features.append(trainingDataRow[:-1])
    outcomes.append(trainingDataRow[-1])
#print(features)
#print(outcomes)
# fit the keras model on the dataset
model.fit(features, outcomes, epochs=500, verbose=2)
# Evaluate the keras model accuracy
_, accuracy = model.evaluate(features, outcomes)
print('Accuracy: %.2f' % (accuracy*100))
# Test model predictions against training data
#predictions = model.predict_classes(features)
predictions = (model.predict(features) > 0.5).astype("int32")
for i in range(0, 160, 4):
    print('%s => %d (expected %d)' % (features[i], predictions[i], outcomes[i]))

# Print the model summary
print(model.summary())

Output:

Epoch 1/500
6/6 - 16s - loss: 1354.8107 - accuracy: 0.5000
Epoch 2/500
6/6 - 0s - loss: 1178.2457 - accuracy: 0.5000
Epoch 3/500
6/6 - 0s - loss: 999.3210 - accuracy: 0.5000
Epoch 4/500
6/6 - 0s - loss: 821.8029 - accuracy: 0.5000
Epoch 5/500
6/6 - 0s - loss: 651.4391 - accuracy: 0.5000
Epoch 6/500
6/6 - 0s - loss: 474.1348 - accuracy: 0.5000
Epoch 7/500
6/6 - 0s - loss: 312.3422 - accuracy: 0.5000
Epoch 8/500
6/6 - 0s - loss: 136.2285 - accuracy: 0.5000
Epoch 9/500
6/6 - 0s - loss: 18.8638 - accuracy: 0.4821
Epoch 10/500
6/6 - 0s - loss: 55.3453 - accuracy: 0.5000
Epoch 11/500
6/6 - 0s - loss: 62.0218 - accuracy: 0.5000
Epoch 12/500
6/6 - 0s - loss: 48.6303 - accuracy: 0.5000
Epoch 13/500
6/6 - 0s - loss: 24.6552 - accuracy: 0.5000
Epoch 14/500
6/6 - 0s - loss: 12.4640 - accuracy: 0.4524
Epoch 15/500
6/6 - 0s - loss: 11.6526 - accuracy: 0.3988
Epoch 16/500
6/6 - 0s - loss: 10.0779 - accuracy: 0.5000
Epoch 17/500
6/6 - 0s - loss: 4.5000 - accuracy: 0.3690
Epoch 18/500
6/6 - 0s - loss: 4.4624 - accuracy: 0.4940
Epoch 19/500
6/6 - 0s - loss: 3.5397 - accuracy: 0.4702
Epoch 20/500
6/6 - 0s - loss: 3.1084 - accuracy: 0.5060
Epoch 21/500
6/6 - 0s - loss: 3.3269 - accuracy: 0.4762
Epoch 22/500
6/6 - 0s - loss: 5.0356 - accuracy: 0.5000
Epoch 23/500
6/6 - 0s - loss: 4.5903 - accuracy: 0.3988
Epoch 24/500
6/6 - 0s - loss: 4.0782 - accuracy: 0.5238
Epoch 25/500
6/6 - 0s - loss: 4.2735 - accuracy: 0.4048
Epoch 26/500
6/6 - 0s - loss: 3.4100 - accuracy: 0.4048
Epoch 27/500
6/6 - 0s - loss: 3.4263 - accuracy: 0.4702
Epoch 28/500
6/6 - 0s - loss: 3.0845 - accuracy: 0.4881
Epoch 29/500
6/6 - 0s - loss: 3.6671 - accuracy: 0.4226
Epoch 30/500
6/6 - 0s - loss: 3.2743 - accuracy: 0.4881
Epoch 31/500
6/6 - 0s - loss: 2.7294 - accuracy: 0.5060
Epoch 32/500
6/6 - 0s - loss: 2.6796 - accuracy: 0.4702
Epoch 33/500
6/6 - 0s - loss: 2.3139 - accuracy: 0.5119
Epoch 34/500
6/6 - 0s - loss: 2.2984 - accuracy: 0.4821
Epoch 35/500
6/6 - 0s - loss: 2.5401 - accuracy: 0.4881
Epoch 36/500
6/6 - 0s - loss: 2.5181 - accuracy: 0.4881
Epoch 37/500
6/6 - 0s - loss: 2.2515 - accuracy: 0.4940
Epoch 38/500
6/6 - 0s - loss: 2.1356 - accuracy: 0.4821
Epoch 39/500
6/6 - 0s - loss: 2.0135 - accuracy: 0.4643
Epoch 40/500
6/6 - 0s - loss: 1.9647 - accuracy: 0.4821
Epoch 41/500
6/6 - 0s - loss: 2.4048 - accuracy: 0.3869
Epoch 42/500
6/6 - 0s - loss: 4.4917 - accuracy: 0.4940
Epoch 43/500
6/6 - 0s - loss: 2.9583 - accuracy: 0.4286
Epoch 44/500
6/6 - 0s - loss: 2.0616 - accuracy: 0.4048
Epoch 45/500
6/6 - 0s - loss: 2.2955 - accuracy: 0.4643
Epoch 46/500
6/6 - 0s - loss: 2.6275 - accuracy: 0.4107
Epoch 47/500
6/6 - 0s - loss: 2.1440 - accuracy: 0.4762
Epoch 48/500
6/6 - 0s - loss: 2.2715 - accuracy: 0.4643
Epoch 49/500
6/6 - 0s - loss: 1.6154 - accuracy: 0.4762
Epoch 50/500
6/6 - 0s - loss: 1.3189 - accuracy: 0.5179
[etc]
Epoch 100/500
6/6 - 0s - loss: 0.9070 - accuracy: 0.7143
Epoch 101/500
6/6 - 0s - loss: 1.2207 - accuracy: 0.6786
Epoch 102/500
6/6 - 0s - loss: 1.7759 - accuracy: 0.6429
Epoch 103/500
6/6 - 0s - loss: 2.3393 - accuracy: 0.6012
Epoch 104/500
6/6 - 0s - loss: 6.1804 - accuracy: 0.5000
Epoch 105/500
6/6 - 0s - loss: 4.0825 - accuracy: 0.5833
Epoch 106/500
6/6 - 0s - loss: 2.5547 - accuracy: 0.5060
Epoch 107/500
6/6 - 0s - loss: 2.1338 - accuracy: 0.6131
Epoch 108/500
6/6 - 0s - loss: 0.8321 - accuracy: 0.7619
Epoch 109/500
6/6 - 0s - loss: 1.0405 - accuracy: 0.6964
Epoch 110/500
6/6 - 0s - loss: 1.4533 - accuracy: 0.6190
Epoch 111/500
6/6 - 0s - loss: 2.4647 - accuracy: 0.5417
Epoch 112/500
6/6 - 0s - loss: 2.4917 - accuracy: 0.5476
Epoch 113/500
6/6 - 0s - loss: 2.2088 - accuracy: 0.5357
Epoch 114/500
6/6 - 0s - loss: 1.5922 - accuracy: 0.5952
Epoch 115/500
6/6 - 0s - loss: 0.9498 - accuracy: 0.6488
Epoch 116/500
6/6 - 0s - loss: 0.7612 - accuracy: 0.6905
Epoch 117/500
6/6 - 0s - loss: 0.9210 - accuracy: 0.7917
Epoch 118/500
6/6 - 0s - loss: 1.5319 - accuracy: 0.6012
Epoch 119/500
6/6 - 0s - loss: 2.8555 - accuracy: 0.5179
Epoch 120/500
6/6 - 0s - loss: 1.6090 - accuracy: 0.6131
Epoch 121/500
6/6 - 0s - loss: 1.9316 - accuracy: 0.5119
Epoch 122/500
6/6 - 0s - loss: 3.9357 - accuracy: 0.5060
Epoch 123/500
6/6 - 0s - loss: 1.2138 - accuracy: 0.6845
Epoch 124/500
6/6 - 0s - loss: 0.9364 - accuracy: 0.6726
Epoch 125/500
6/6 - 0s - loss: 1.7571 - accuracy: 0.6429
Epoch 126/500
6/6 - 0s - loss: 3.0547 - accuracy: 0.5298
Epoch 127/500
6/6 - 0s - loss: 0.5857 - accuracy: 0.7857
Epoch 128/500
6/6 - 0s - loss: 1.0959 - accuracy: 0.6964
Epoch 129/500
6/6 - 0s - loss: 2.6186 - accuracy: 0.4940
Epoch 130/500
6/6 - 0s - loss: 2.3914 - accuracy: 0.6250
Epoch 131/500
6/6 - 0s - loss: 1.3078 - accuracy: 0.6607
Epoch 132/500
6/6 - 0s - loss: 1.1346 - accuracy: 0.6548
Epoch 133/500
6/6 - 1s - loss: 0.9108 - accuracy: 0.7679
Epoch 134/500
6/6 - 0s - loss: 0.9480 - accuracy: 0.7024
Epoch 135/500
6/6 - 0s - loss: 2.3744 - accuracy: 0.5298
Epoch 136/500
6/6 - 0s - loss: 0.9875 - accuracy: 0.7262
Epoch 137/500
6/6 - 0s - loss: 3.0520 - accuracy: 0.5536
Epoch 138/500
6/6 - 0s - loss: 3.3804 - accuracy: 0.6250
Epoch 139/500
6/6 - 0s - loss: 2.2378 - accuracy: 0.6607
Epoch 140/500
6/6 - 0s - loss: 0.7141 - accuracy: 0.7202
Epoch 141/500
6/6 - 0s - loss: 1.0920 - accuracy: 0.6964
Epoch 142/500
6/6 - 0s - loss: 1.3448 - accuracy: 0.6905
Epoch 143/500
6/6 - 0s - loss: 1.8249 - accuracy: 0.6190
Epoch 144/500
6/6 - 0s - loss: 1.2369 - accuracy: 0.7321
Epoch 145/500
6/6 - 0s - loss: 1.9031 - accuracy: 0.6488
Epoch 146/500
6/6 - 0s - loss: 2.8954 - accuracy: 0.5774
Epoch 147/500
6/6 - 0s - loss: 3.6607 - accuracy: 0.5298
Epoch 148/500
6/6 - 0s - loss: 0.6843 - accuracy: 0.7976
Epoch 149/500
6/6 - 0s - loss: 0.4969 - accuracy: 0.8393
Epoch 150/500
6/6 - 0s - loss: 0.5488 - accuracy: 0.8333
[etc]
Epoch 200/500
6/6 - 0s - loss: 1.2898 - accuracy: 0.7619
Epoch 201/500
6/6 - 0s - loss: 0.5927 - accuracy: 0.7976
Epoch 202/500
6/6 - 0s - loss: 1.0875 - accuracy: 0.7262
Epoch 203/500
6/6 - 0s - loss: 1.1926 - accuracy: 0.7440
Epoch 204/500
6/6 - 0s - loss: 1.4880 - accuracy: 0.6845
Epoch 205/500
6/6 - 0s - loss: 1.3070 - accuracy: 0.7143
Epoch 206/500
6/6 - 0s - loss: 1.0667 - accuracy: 0.7321
Epoch 207/500
6/6 - 0s - loss: 1.7004 - accuracy: 0.6548
Epoch 208/500
6/6 - 0s - loss: 1.7348 - accuracy: 0.6964
Epoch 209/500
6/6 - 0s - loss: 1.3299 - accuracy: 0.6845
Epoch 210/500
6/6 - 0s - loss: 1.6381 - accuracy: 0.6071
Epoch 211/500
6/6 - 0s - loss: 3.2871 - accuracy: 0.5357
Epoch 212/500
6/6 - 0s - loss: 1.2281 - accuracy: 0.6786
Epoch 213/500
6/6 - 0s - loss: 0.7263 - accuracy: 0.7619
Epoch 214/500
6/6 - 0s - loss: 0.5449 - accuracy: 0.8155
Epoch 215/500
6/6 - 0s - loss: 0.8705 - accuracy: 0.7440
Epoch 216/500
6/6 - 0s - loss: 0.6476 - accuracy: 0.7738
Epoch 217/500
6/6 - 0s - loss: 0.6375 - accuracy: 0.7917
Epoch 218/500
6/6 - 0s - loss: 0.8303 - accuracy: 0.7440
Epoch 219/500
6/6 - 0s - loss: 1.6169 - accuracy: 0.7262
Epoch 220/500
6/6 - 0s - loss: 0.9138 - accuracy: 0.7321
Epoch 221/500
6/6 - 0s - loss: 0.6194 - accuracy: 0.8036
Epoch 222/500
6/6 - 0s - loss: 1.0605 - accuracy: 0.6905
Epoch 223/500
6/6 - 0s - loss: 2.4083 - accuracy: 0.5893
Epoch 224/500
6/6 - 0s - loss: 1.3246 - accuracy: 0.7440
Epoch 225/500
6/6 - 0s - loss: 2.6301 - accuracy: 0.5179
Epoch 226/500
6/6 - 0s - loss: 2.8703 - accuracy: 0.6429
Epoch 227/500
6/6 - 0s - loss: 2.3044 - accuracy: 0.6131
Epoch 228/500
6/6 - 0s - loss: 1.3211 - accuracy: 0.7262
Epoch 229/500
6/6 - 0s - loss: 1.6195 - accuracy: 0.7440
Epoch 230/500
6/6 - 0s - loss: 2.7555 - accuracy: 0.5357
Epoch 231/500
6/6 - 0s - loss: 1.5666 - accuracy: 0.6667
Epoch 232/500
6/6 - 0s - loss: 1.1756 - accuracy: 0.7083
Epoch 233/500
6/6 - 0s - loss: 1.1853 - accuracy: 0.7143
Epoch 234/500
6/6 - 0s - loss: 1.6407 - accuracy: 0.7024
Epoch 235/500
6/6 - 0s - loss: 2.0056 - accuracy: 0.6369
Epoch 236/500
6/6 - 0s - loss: 1.3144 - accuracy: 0.6905
Epoch 237/500
6/6 - 0s - loss: 1.4989 - accuracy: 0.6667
Epoch 238/500
6/6 - 0s - loss: 1.8646 - accuracy: 0.6845
Epoch 239/500
6/6 - 0s - loss: 2.2994 - accuracy: 0.6369
Epoch 240/500
6/6 - 0s - loss: 1.6436 - accuracy: 0.6429
Epoch 241/500
6/6 - 0s - loss: 1.0493 - accuracy: 0.7321
Epoch 242/500
6/6 - 0s - loss: 1.2539 - accuracy: 0.6607
Epoch 243/500
6/6 - 0s - loss: 0.9008 - accuracy: 0.7321
Epoch 244/500
6/6 - 0s - loss: 1.7432 - accuracy: 0.6488
Epoch 245/500
6/6 - 0s - loss: 0.5311 - accuracy: 0.8095
Epoch 246/500
6/6 - 0s - loss: 0.6543 - accuracy: 0.8155
Epoch 247/500
6/6 - 0s - loss: 1.8998 - accuracy: 0.6726
Epoch 248/500
6/6 - 0s - loss: 2.6555 - accuracy: 0.6429
Epoch 249/500
6/6 - 0s - loss: 1.2809 - accuracy: 0.7083
Epoch 250/500
6/6 - 0s - loss: 1.3445 - accuracy: 0.7262
[etc]
Epoch 300/500
6/6 - 0s - loss: 1.1862 - accuracy: 0.7560
Epoch 301/500
6/6 - 0s - loss: 1.5770 - accuracy: 0.7143
Epoch 302/500
6/6 - 0s - loss: 1.4353 - accuracy: 0.7083
Epoch 303/500
6/6 - 0s - loss: 1.4100 - accuracy: 0.6786
Epoch 304/500
6/6 - 0s - loss: 2.4307 - accuracy: 0.6488
Epoch 305/500
6/6 - 0s - loss: 2.6659 - accuracy: 0.6310
Epoch 306/500
6/6 - 0s - loss: 1.9433 - accuracy: 0.6905
Epoch 307/500
6/6 - 0s - loss: 1.5349 - accuracy: 0.6250
Epoch 308/500
6/6 - 0s - loss: 1.6226 - accuracy: 0.6250
Epoch 309/500
6/6 - 0s - loss: 2.4500 - accuracy: 0.7202
Epoch 310/500
6/6 - 0s - loss: 1.4430 - accuracy: 0.7381
Epoch 311/500
6/6 - 0s - loss: 0.4379 - accuracy: 0.7976
Epoch 312/500
6/6 - 0s - loss: 0.6183 - accuracy: 0.7917
Epoch 313/500
6/6 - 0s - loss: 2.0482 - accuracy: 0.6310
Epoch 314/500
6/6 - 0s - loss: 3.3638 - accuracy: 0.5774
Epoch 315/500
6/6 - 0s - loss: 3.8912 - accuracy: 0.5655
Epoch 316/500
6/6 - 0s - loss: 3.9136 - accuracy: 0.6488
Epoch 317/500
6/6 - 0s - loss: 2.2531 - accuracy: 0.6905
Epoch 318/500
6/6 - 0s - loss: 1.3046 - accuracy: 0.6488
Epoch 319/500
6/6 - 0s - loss: 1.2162 - accuracy: 0.7202
Epoch 320/500
6/6 - 0s - loss: 1.0580 - accuracy: 0.7440
Epoch 321/500
6/6 - 0s - loss: 1.8373 - accuracy: 0.6786
Epoch 322/500
6/6 - 0s - loss: 1.8425 - accuracy: 0.6905
Epoch 323/500
6/6 - 0s - loss: 2.7649 - accuracy: 0.6607
Epoch 324/500
6/6 - 0s - loss: 3.4705 - accuracy: 0.5833
Epoch 325/500
6/6 - 0s - loss: 3.1626 - accuracy: 0.5595
Epoch 326/500
6/6 - 0s - loss: 2.6431 - accuracy: 0.6310
Epoch 327/500
6/6 - 0s - loss: 3.6114 - accuracy: 0.5833
Epoch 328/500
6/6 - 0s - loss: 0.9896 - accuracy: 0.7321
Epoch 329/500
6/6 - 0s - loss: 0.6469 - accuracy: 0.7976
Epoch 330/500
6/6 - 0s - loss: 1.2563 - accuracy: 0.7202
Epoch 331/500
6/6 - 0s - loss: 0.5271 - accuracy: 0.8274
Epoch 332/500
6/6 - 0s - loss: 1.1875 - accuracy: 0.6369
Epoch 333/500
6/6 - 0s - loss: 2.5915 - accuracy: 0.6607
Epoch 334/500
6/6 - 0s - loss: 3.2393 - accuracy: 0.6012
Epoch 335/500
6/6 - 0s - loss: 1.5667 - accuracy: 0.6726
Epoch 336/500
6/6 - 0s - loss: 3.3724 - accuracy: 0.5893
Epoch 337/500
6/6 - 0s - loss: 4.6063 - accuracy: 0.5595
Epoch 338/500
6/6 - 0s - loss: 4.2892 - accuracy: 0.5655
Epoch 339/500
6/6 - 0s - loss: 3.1177 - accuracy: 0.6429
Epoch 340/500
6/6 - 0s - loss: 1.5357 - accuracy: 0.6726
Epoch 341/500
6/6 - 0s - loss: 1.7998 - accuracy: 0.7321
Epoch 342/500
6/6 - 0s - loss: 3.4708 - accuracy: 0.5952
Epoch 343/500
6/6 - 0s - loss: 1.6397 - accuracy: 0.6369
Epoch 344/500
6/6 - 0s - loss: 1.0144 - accuracy: 0.7143
Epoch 345/500
6/6 - 0s - loss: 0.5396 - accuracy: 0.8095
Epoch 346/500
6/6 - 0s - loss: 0.4824 - accuracy: 0.8333
Epoch 347/500
6/6 - 0s - loss: 0.6122 - accuracy: 0.8095
Epoch 348/500
6/6 - 0s - loss: 0.9176 - accuracy: 0.7262
Epoch 349/500
6/6 - 0s - loss: 1.0516 - accuracy: 0.7083
Epoch 350/500
6/6 - 0s - loss: 3.0169 - accuracy: 0.6964
[etc]
Epoch 400/500
6/6 - 0s - loss: 1.4745 - accuracy: 0.6905
Epoch 401/500
6/6 - 0s - loss: 1.0852 - accuracy: 0.7560
Epoch 402/500
6/6 - 0s - loss: 1.3118 - accuracy: 0.7440
Epoch 403/500
6/6 - 0s - loss: 1.1269 - accuracy: 0.7143
Epoch 404/500
6/6 - 0s - loss: 0.7287 - accuracy: 0.7917
Epoch 405/500
6/6 - 0s - loss: 0.4720 - accuracy: 0.8155
Epoch 406/500
6/6 - 0s - loss: 0.6519 - accuracy: 0.7917
Epoch 407/500
6/6 - 0s - loss: 1.4163 - accuracy: 0.7202
Epoch 408/500
6/6 - 0s - loss: 1.5731 - accuracy: 0.6964
Epoch 409/500
6/6 - 0s - loss: 2.1105 - accuracy: 0.6905
Epoch 410/500
6/6 - 0s - loss: 2.0785 - accuracy: 0.6190
Epoch 411/500
6/6 - 0s - loss: 2.7981 - accuracy: 0.6012
Epoch 412/500
6/6 - 0s - loss: 2.8780 - accuracy: 0.6607
Epoch 413/500
6/6 - 0s - loss: 2.8910 - accuracy: 0.6369
Epoch 414/500
6/6 - 0s - loss: 2.9615 - accuracy: 0.6369
Epoch 415/500
6/6 - 0s - loss: 4.4401 - accuracy: 0.5536
Epoch 416/500
6/6 - 0s - loss: 3.5945 - accuracy: 0.6012
Epoch 417/500
6/6 - 0s - loss: 2.9201 - accuracy: 0.7024
Epoch 418/500
6/6 - 0s - loss: 0.6884 - accuracy: 0.7917
Epoch 419/500
6/6 - 0s - loss: 0.6324 - accuracy: 0.7798
Epoch 420/500
6/6 - 0s - loss: 0.4122 - accuracy: 0.8571
Epoch 421/500
6/6 - 0s - loss: 0.5619 - accuracy: 0.8095
Epoch 422/500
6/6 - 0s - loss: 0.7549 - accuracy: 0.7798
Epoch 423/500
6/6 - 0s - loss: 1.7936 - accuracy: 0.6667
Epoch 424/500
6/6 - 0s - loss: 1.0680 - accuracy: 0.7024
Epoch 425/500
6/6 - 0s - loss: 1.1780 - accuracy: 0.7202
Epoch 426/500
6/6 - 0s - loss: 1.4397 - accuracy: 0.6845
Epoch 427/500
6/6 - 0s - loss: 2.2156 - accuracy: 0.6905
Epoch 428/500
6/6 - 0s - loss: 3.7839 - accuracy: 0.5774
Epoch 429/500
6/6 - 0s - loss: 4.7977 - accuracy: 0.6012
Epoch 430/500
6/6 - 0s - loss: 2.8695 - accuracy: 0.6488
Epoch 431/500
6/6 - 0s - loss: 2.0530 - accuracy: 0.6667
Epoch 432/500
6/6 - 0s - loss: 1.2545 - accuracy: 0.7381
Epoch 433/500
6/6 - 0s - loss: 0.6428 - accuracy: 0.7619
Epoch 434/500
6/6 - 0s - loss: 0.5938 - accuracy: 0.7560
Epoch 435/500
6/6 - 0s - loss: 0.6733 - accuracy: 0.8274
Epoch 436/500
6/6 - 0s - loss: 0.6819 - accuracy: 0.7798
Epoch 437/500
6/6 - 0s - loss: 0.9188 - accuracy: 0.7738
Epoch 438/500
6/6 - 0s - loss: 0.7398 - accuracy: 0.7560
Epoch 439/500
6/6 - 0s - loss: 0.5840 - accuracy: 0.7857
Epoch 440/500
6/6 - 0s - loss: 0.5400 - accuracy: 0.7976
Epoch 441/500
6/6 - 0s - loss: 0.4590 - accuracy: 0.8095
Epoch 442/500
6/6 - 0s - loss: 0.5159 - accuracy: 0.7917
Epoch 443/500
6/6 - 0s - loss: 1.0400 - accuracy: 0.7321
Epoch 444/500
6/6 - 0s - loss: 0.8215 - accuracy: 0.7679
Epoch 445/500
6/6 - 0s - loss: 0.4529 - accuracy: 0.8571
Epoch 446/500
6/6 - 0s - loss: 0.4551 - accuracy: 0.8214
Epoch 447/500
6/6 - 0s - loss: 1.3074 - accuracy: 0.7619
Epoch 448/500
6/6 - 0s - loss: 1.7789 - accuracy: 0.7024
Epoch 449/500
6/6 - 0s - loss: 1.5543 - accuracy: 0.7440
Epoch 450/500
6/6 - 0s - loss: 0.7703 - accuracy: 0.7679
Epoch 451/500
6/6 - 0s - loss: 0.6668 - accuracy: 0.7857
Epoch 452/500
6/6 - 0s - loss: 0.4969 - accuracy: 0.7917
Epoch 453/500
6/6 - 0s - loss: 0.7541 - accuracy: 0.7560
Epoch 454/500
6/6 - 0s - loss: 0.6923 - accuracy: 0.7619
Epoch 455/500
6/6 - 0s - loss: 0.4974 - accuracy: 0.7798
Epoch 456/500
6/6 - 0s - loss: 0.9360 - accuracy: 0.7202
Epoch 457/500
6/6 - 0s - loss: 0.8772 - accuracy: 0.7262
Epoch 458/500
6/6 - 0s - loss: 0.5241 - accuracy: 0.7976
Epoch 459/500
6/6 - 0s - loss: 0.6441 - accuracy: 0.8452
Epoch 460/500
6/6 - 0s - loss: 0.8062 - accuracy: 0.7798
Epoch 461/500
6/6 - 0s - loss: 0.5682 - accuracy: 0.7738
Epoch 462/500
6/6 - 0s - loss: 1.2154 - accuracy: 0.7083
Epoch 463/500
6/6 - 0s - loss: 0.9338 - accuracy: 0.7440
Epoch 464/500
6/6 - 0s - loss: 0.5318 - accuracy: 0.7798
Epoch 465/500
6/6 - 0s - loss: 0.4244 - accuracy: 0.7679
Epoch 466/500
6/6 - 0s - loss: 0.7675 - accuracy: 0.7857
Epoch 467/500
6/6 - 0s - loss: 1.0929 - accuracy: 0.7024
Epoch 468/500
6/6 - 0s - loss: 2.7063 - accuracy: 0.6071
Epoch 469/500
6/6 - 0s - loss: 3.1285 - accuracy: 0.5714
Epoch 470/500
6/6 - 0s - loss: 1.4364 - accuracy: 0.7262
Epoch 471/500
6/6 - 0s - loss: 1.3276 - accuracy: 0.7143
Epoch 472/500
6/6 - 0s - loss: 0.5895 - accuracy: 0.8274
Epoch 473/500
6/6 - 0s - loss: 0.7874 - accuracy: 0.7202
Epoch 474/500
6/6 - 0s - loss: 0.8847 - accuracy: 0.7560
Epoch 475/500
6/6 - 0s - loss: 2.4059 - accuracy: 0.6190
Epoch 476/500
6/6 - 0s - loss: 0.5856 - accuracy: 0.7976
Epoch 477/500
6/6 - 0s - loss: 1.6138 - accuracy: 0.6726
Epoch 478/500
6/6 - 0s - loss: 3.6635 - accuracy: 0.6190
Epoch 479/500
6/6 - 0s - loss: 1.5387 - accuracy: 0.6786
Epoch 480/500
6/6 - 0s - loss: 1.5804 - accuracy: 0.7202
Epoch 481/500
6/6 - 0s - loss: 1.1936 - accuracy: 0.7500
Epoch 482/500
6/6 - 0s - loss: 0.5385 - accuracy: 0.8274
Epoch 483/500
6/6 - 0s - loss: 0.5614 - accuracy: 0.7679
Epoch 484/500
6/6 - 0s - loss: 0.8756 - accuracy: 0.7500
Epoch 485/500
6/6 - 0s - loss: 0.6093 - accuracy: 0.7560
Epoch 486/500
6/6 - 0s - loss: 1.9588 - accuracy: 0.6310
Epoch 487/500
6/6 - 0s - loss: 1.7198 - accuracy: 0.7083
Epoch 488/500
6/6 - 0s - loss: 0.9304 - accuracy: 0.7321
Epoch 489/500
6/6 - 0s - loss: 0.5345 - accuracy: 0.8095
Epoch 490/500
6/6 - 0s - loss: 2.6957 - accuracy: 0.5893
Epoch 491/500
6/6 - 0s - loss: 2.6077 - accuracy: 0.6607
Epoch 492/500
6/6 - 0s - loss: 4.0575 - accuracy: 0.5774
Epoch 493/500
6/6 - 0s - loss: 3.4324 - accuracy: 0.5655
Epoch 494/500
6/6 - 0s - loss: 3.7296 - accuracy: 0.6131
Epoch 495/500
6/6 - 0s - loss: 2.2588 - accuracy: 0.7262
Epoch 496/500
6/6 - 0s - loss: 3.1585 - accuracy: 0.6250
Epoch 497/500
6/6 - 0s - loss: 2.1257 - accuracy: 0.6905
Epoch 498/500
6/6 - 0s - loss: 1.3597 - accuracy: 0.7381
Epoch 499/500
6/6 - 0s - loss: 0.5437 - accuracy: 0.8155
Epoch 500/500
6/6 - 0s - loss: 0.4785 - accuracy: 0.7976
1/6 [====>.........................] - ETA: 2s - loss: 0.0759 - accuracy: 1.0000
6/6 [==============================] - 1s 3ms/step - loss: 0.5537 - accuracy: 0.8333
Accuracy: 83.33
[10, 1000, 0] => 0 (expected 0)
[14, 1004, 0] => 0 (expected 0)
[18, 1008, 0] => 0 (expected 0)
[22, 1012, 0] => 0 (expected 0)
[26, 1016, 0] => 0 (expected 0)
[30, 1020, 0] => 0 (expected 0)
[13, 1003, 1] => 0 (expected 0)
[17, 1007, 1] => 0 (expected 0)
[21, 1011, 1] => 0 (expected 0)
[25, 1015, 1] => 0 (expected 0)
[29, 1019, 1] => 0 (expected 0)
[12, 9002, 0] => 1 (expected 0)
[16, 9006, 0] => 1 (expected 0)
[20, 9010, 0] => 1 (expected 0)
[24, 9014, 0] => 1 (expected 0)
[28, 9018, 0] => 1 (expected 0)
[11, 9022, 1] => 1 (expected 1)
[15, 9026, 1] => 1 (expected 1)
[19, 9030, 1] => 1 (expected 1)
[23, 9034, 1] => 1 (expected 1)
[27, 9038, 1] => 1 (expected 1)
[70, 1000, 0] => 1 (expected 1)
[74, 1004, 0] => 1 (expected 1)
[78, 1008, 0] => 1 (expected 1)
[82, 1012, 0] => 1 (expected 1)
[86, 1016, 0] => 1 (expected 1)
[90, 1020, 0] => 1 (expected 1)
[73, 1003, 1] => 0 (expected 0)
[77, 1007, 1] => 0 (expected 0)
[81, 1011, 1] => 0 (expected 0)
[85, 1015, 1] => 1 (expected 0)
[89, 1019, 1] => 1 (expected 0)
[72, 9002, 0] => 1 (expected 1)
[76, 9006, 0] => 1 (expected 1)
[80, 9010, 0] => 1 (expected 1)
[84, 9014, 0] => 1 (expected 1)
[88, 9018, 0] => 1 (expected 1)
[71, 9001, 1] => 1 (expected 1)
[75, 9005, 1] => 1 (expected 1)
[79, 9009, 1] => 1 (expected 1)
Model: "sequential"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
dense (Dense)                (None, 12)                48        
_________________________________________________________________
dense_1 (Dense)              (None, 6)                 78        
_________________________________________________________________
dense_2 (Dense)              (None, 1)                 7         
=================================================================
Total params: 133
Trainable params: 133
Non-trainable params: 0
_________________________________________________________________
None

Many thanks, Kevin

Could it be that you don't have enough training data? Normally you would train on 1000s of data points, not just 8. If that's the case, the weights of the trained network will end up very similar to what they were before training.

That said, your loss does look like it is decreasing from epoch to epoch, so I think you should be fine if you simply train for more epochs. This effectively accomplishes much the same thing as adding more training data, assuming you can't come up with more combinations of features and actions. If you can, definitely do that before increasing the number of epochs.
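
For illustration only, here is a minimal sketch of what generating more feature-and-choice combinations programmatically could look like, rather than typing every row by hand. The labelling rule label_fn below is a hypothetical stand-in, not the card game's real logic, and feeding NumPy arrays to model.fit is just the conventional way to pass the data:

import numpy as np

rng = np.random.default_rng(0)  # seeded so the generated data is reproducible

def label_fn(stat, score, choice):
    # Hypothetical outcome rule, used only to build example rows;
    # substitute the real card-game logic here
    return int((stat >= 50) == bool(choice))

# Build a few thousand rows instead of a handful of hand-typed ones
stats = rng.integers(10, 91, size=2000)
scores = rng.integers(1000, 9001, size=2000)
choices = rng.integers(0, 2, size=2000)

features = np.column_stack([stats, scores, choices]).astype("float32")
outcomes = np.array([label_fn(s, sc, c) for s, sc, c in zip(stats, scores, choices)],
                    dtype="float32")

# features and outcomes would then replace the hand-built lists passed to model.fit(...)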
