Returning probabilities in a classification prediction in Keras



I am trying to build a simple proof of concept where I can see the probabilities of the different classes for a given prediction.

However, everything I try seems to only output the predicted class, even though I am using a softmax activation. I am new to machine learning, so I'm not sure whether I am making a simple mistake or whether this is a feature not available in Keras.

I am using Keras with the TensorFlow backend. I have adapted one of the basic examples provided by Keras for classifying the MNIST dataset.

My code below is identical to that example, except for a few extra (commented) lines that export the model to a local file.

'''Trains a simple deep NN on the MNIST dataset.
Gets to 98.40% test accuracy after 20 epochs
(there is *a lot* of margin for parameter tuning).
2 seconds per epoch on a K520 GPU.
'''
from __future__ import print_function
import keras
from keras.datasets import mnist
from keras.models import Sequential
from keras.layers import Dense, Dropout
from keras.optimizers import RMSprop
import h5py # added import because it is required for model.save
model_filepath = 'test_model.h5' # added filepath config
batch_size = 128
num_classes = 10
epochs = 20
# the data, shuffled and split between train and test sets
(x_train, y_train), (x_test, y_test) = mnist.load_data()
x_train = x_train.reshape(60000, 784)
x_test = x_test.reshape(10000, 784)
x_train = x_train.astype('float32')
x_test = x_test.astype('float32')
x_train /= 255
x_test /= 255
print(x_train.shape[0], 'train samples')
print(x_test.shape[0], 'test samples')
# convert class vectors to binary class matrices
y_train = keras.utils.to_categorical(y_train, num_classes)
y_test = keras.utils.to_categorical(y_test, num_classes)
model = Sequential()
model.add(Dense(512, activation='relu', input_shape=(784,)))
model.add(Dropout(0.2))
model.add(Dense(512, activation='relu'))
model.add(Dropout(0.2))
model.add(Dense(num_classes, activation='softmax'))
model.summary()
model.compile(loss='categorical_crossentropy',
              optimizer=RMSprop(),
              metrics=['accuracy'])
history = model.fit(x_train, y_train,
                    batch_size=batch_size,
                    epochs=epochs,
                    verbose=1,
                    validation_data=(x_test, y_test))
score = model.evaluate(x_test, y_test, verbose=0)
print('Test loss:', score[0])
print('Test accuracy:', score[1])
model.save(model_filepath) # added saving model
print('Model saved') # added log

The second part is then a simple script that should import the model, predict the class for some given data, and print out the probabilities for each class. (I am using the same MNIST data included with the Keras codebase to keep the example as simple as possible.)

import keras
from keras.datasets import mnist
from keras.models import Sequential
import keras.backend as K
import numpy
# loading model saved locally in test_model.h5
model_filepath = 'test_model.h5'
prev_model = keras.models.load_model(model_filepath)
# these lines are copied from the example for loading MNIST data
(x_train, y_train), (x_test, y_test) = mnist.load_data()
x_train = x_train.reshape(60000, 784)
# for this example, I am only taking the first 10 images
x_slice = x_train[slice(1, 11, 1)]
# making the prediction
prediction = prev_model.predict(x_slice)
# logging each on a separate line
for single_prediction in prediction:
    print(single_prediction)

If I run the first script to export the model, and then the second script to classify some examples, I get the following output:

[ 1.  0.  0.  0.  0.  0.  0.  0.  0.  0.]
[ 0.  0.  0.  0.  1.  0.  0.  0.  0.  0.]
[ 0.  1.  0.  0.  0.  0.  0.  0.  0.  0.]
[ 0.  0.  0.  0.  0.  0.  0.  0.  0.  1.]
[ 0.  0.  1.  0.  0.  0.  0.  0.  0.  0.]
[ 0.  1.  0.  0.  0.  0.  0.  0.  0.  0.]
[ 0.  0.  0.  1.  0.  0.  0.  0.  0.  0.]
[ 0.  1.  0.  0.  0.  0.  0.  0.  0.  0.]
[ 0.  0.  0.  0.  1.  0.  0.  0.  0.  0.]
[ 0.  0.  0.  1.  0.  0.  0.  0.  0.  0.]

This is fine for seeing which class each example belongs to, but what if I want to see the relative probabilities of each class for every example? I am looking for something more like this:

[ 0.94 0.01 0.02 0. 0. 0.01 0. 0.01 0.01 0.]
[ 0. 0. 0. 0. 0.51 0. 0. 0. 0.49 0.]
...

In other words, I need to know how sure each prediction is, not just the prediction itself. I thought that seeing the relative probabilities was part of what using a softmax activation in the model gives you, but I can't find anything in the Keras documentation that would give me probabilities instead of predictions. Am I making some kind of silly mistake, or is this feature not available?

So it turns out that I was not fully normalizing the data in my prediction script.

My prediction script should have had the following lines:

# these lines are copied from the example for loading MNIST data
(x_train, y_train), (x_test, y_test) = mnist.load_data()
x_train = x_train.reshape(60000, 784)
x_train = x_train.astype('float32') # this line was missing
x_train /= 255 # this line was missing too

Because the data was not cast to float and divided by 255 (so that it falls between 0 and 1), the output only showed up as 1s and 0s.
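A quick way to see the difference is to print the dtype and value range before and after normalization. This is just a sanity-check sketch using the x_train array from mnist.load_data() after the reshape shown above:

print(x_train.dtype, x_train.min(), x_train.max())  # uint8 0 255 without normalization
x_train = x_train.astype('float32')
x_train /= 255
print(x_train.dtype, x_train.min(), x_train.max())  # float32 0.0 1.0 after normalization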

Keras predict does indeed return probabilities, not classes.
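For completeness, once predict returns the per-class probabilities, the most likely class and its confidence can be recovered with argmax/max. A minimal sketch using the names from the prediction script above (numpy assumed imported as np):

import numpy as np
probabilities = prev_model.predict(x_slice)           # shape: (num_examples, num_classes)
predicted_classes = np.argmax(probabilities, axis=1)  # most likely class per example
confidences = np.max(probabilities, axis=1)           # probability assigned to that class
for cls, conf in zip(predicted_classes, confidences):
    print('predicted class %d with probability %.4f' % (cls, conf))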

I cannot reproduce your issue with my system configuration:

Python version 2.7.12
Tensorflow version 1.3.0
Keras version 2.0.9
Numpy version 1.13.3

Here is the prediction output I get on your x_slice using the loaded model (trained for 20 epochs, as in your code):

print(prev_model.predict(x_slice))
# Result: 
[[  1.00000000e+00   3.31656316e-37   1.07806675e-21   7.11765177e-30
    2.48000320e-31   5.34837679e-28   3.12470132e-24   4.65175406e-27
    8.66994134e-31   5.26426367e-24]
 [  0.00000000e+00   5.34361977e-30   3.91144999e-35   0.00000000e+00
    1.00000000e+00   0.00000000e+00   1.05583665e-36   1.01395577e-29
    0.00000000e+00   1.70868685e-29]
 [  3.99137559e-38   1.00000000e+00   1.76682222e-24   9.33333581e-31
    3.99846307e-15   1.17745576e-24   1.87529709e-26   2.18951752e-20
    3.57518280e-17   1.62027896e-28]
 [  6.48006586e-26   1.48974980e-17   5.60530329e-22   1.81973780e-14
    9.12573406e-10   1.95987500e-14   8.08566866e-27   1.17901132e-12
    7.33970447e-13   1.00000000e+00]
 [  2.01602060e-16   6.58242856e-14   1.00000000e+00   6.84244084e-09
    1.19809885e-16   7.94907624e-14   3.10690434e-19   8.02848586e-12
    4.68330721e-11   5.14736501e-15]
 [  2.31014903e-35   1.00000000e+00   6.02224725e-21   2.35928828e-23
    7.50006509e-15   4.06930881e-22   1.13288827e-24   4.20440718e-17
    4.95182972e-17   1.85492109e-18]
 [  0.00000000e+00   0.00000000e+00   0.00000000e+00   1.00000000e+00
    0.00000000e+00   6.30200370e-27   0.00000000e+00   5.19937755e-33
    1.63205659e-31   1.21508034e-20]
 [  1.44608573e-26   1.00000000e+00   1.78712268e-18   6.84598301e-19
    1.30042071e-11   2.53873986e-14   5.83169942e-17   1.20201071e-12
    2.21844570e-14   3.75015198e-15]
 [  0.00000000e+00   6.29184453e-34   9.22474943e-29   0.00000000e+00
    1.00000000e+00   3.05067233e-34   1.43097161e-28   1.34234082e-29
    4.28647272e-36   9.29760838e-34]
 [  4.68828449e-30   5.55172479e-20   3.26705529e-19   9.99999881e-01
    3.49577992e-22   1.27715460e-11   4.99185615e-36   1.19164204e-20
    4.21086124e-16   1.52631387e-07]]

I suspect some rounding issue when printing (or you have trained for many more epochs, and your probabilities on the training set have gotten very close to 1)...
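If print formatting is the culprit, a quick check (a sketch, assuming numpy is imported as np and prev_model/x_slice are defined as in your prediction script) is to switch numpy to fixed-point display before printing the predictions:

import numpy as np
np.set_printoptions(suppress=True, precision=8)  # fixed-point display instead of scientific notation
print(prev_model.predict(x_slice))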

To convince yourself that you really are getting probabilities and not class predictions, I suggest trying predictions from a less heavily trained model; normally you should see far fewer 1.0's. Here is what I get from model after training it for fewer epochs:

print(model.predict(x_slice))
# Result: 
[[  9.99916673e-01   5.36548761e-08   6.10747229e-05   8.21199933e-07
    6.64725164e-08   6.78853041e-07   9.09637220e-06   4.56192402e-06
    1.62688798e-06   5.23997733e-06]
 [  7.59836894e-07   1.78043920e-05   1.79073555e-04   2.95592145e-05
    9.98031914e-01   1.75839632e-05   5.90557102e-06   1.27705920e-03
    3.94643757e-06   4.36416740e-04]
 [  4.48473330e-08   9.99895334e-01   2.82608235e-05   5.33154832e-07
    9.78453227e-06   1.58954310e-06   3.38150176e-06   5.26260410e-05
    8.09341054e-06   3.28643267e-07]
 [  7.38236849e-07   4.80247072e-05   2.81726116e-05   4.77648537e-05
    7.21933879e-03   2.52177160e-05   3.88786475e-07   3.56770557e-04
    2.83472677e-04   9.91990149e-01]
 [  5.03611082e-05   2.69402866e-04   9.92011130e-01   4.68175858e-03
    9.57477605e-05   4.26214538e-04   7.66683661e-05   7.05923303e-04
    1.45670515e-03   2.26032615e-04]
 [  1.36330849e-10   9.99994516e-01   7.69141934e-07   1.44130311e-07
    9.52201333e-07   1.45219332e-07   4.43408908e-07   6.93398249e-07
    2.18685204e-06   1.50741769e-07]
 [  2.39427478e-09   3.75754922e-07   3.89349816e-06   9.99889374e-01
    1.85837867e-09   1.16176770e-05   1.89989760e-11   3.12301523e-07
    1.13220040e-05   8.29571582e-05]
 [  1.45760115e-08   9.99900222e-01   3.67058942e-06   4.04857201e-06
    1.97999962e-05   7.85745397e-06   8.13850420e-06   1.87294081e-05
    2.81870762e-05   9.38157609e-06]
 [  7.52560858e-09   8.84437856e-09   9.71140025e-07   5.20911703e-10
    9.99986649e-01   3.12135370e-07   1.06521384e-05   1.25693066e-06
    7.21853368e-08   5.21001624e-08]
 [  8.67672298e-08   2.17907742e-04   2.45352840e-06   9.95455265e-01
    1.43749105e-06   1.51766278e-03   1.83744309e-08   3.83995541e-07
    9.90309782e-05   2.70584645e-03]]
