TensorFlow accuracy score



I am building a regression-type machine learning model. I started with XGBoost to get my first estimates. However, they are not convincing enough (with the XGB regressor I only reach an R2 score of 0.60).

So I started looking at neural-network solutions and ended up using TensorFlow. However, I am fairly new to this module and would like to know: is there an equivalent of xgboost.score?

The first snippet uses XGBoost; I am now working on the second one, which uses TensorFlow.

from xgboost import XGBRegressor

xgb = XGBRegressor(learning_rate = 0.30012, max_depth = 5, n_estimators = 180,
                   subsample = 0.7, colsample_bylevel = 0.7, colsample_bytree = 0.7,
                   min_child_weight = 4, reg_alpha = 10, reg_lambda = 10)
xgb.fit(X_train, y_train)
print("Score on train data : " + str(xgb.score(X_train, y_train)))
print("Score on validation data : " + str(xgb.score(X_val, y_val)))

The second one uses TensorFlow:

import tensorflow as tf

tf.random.set_seed(123)  # first we set a random seed
model = tf.keras.Sequential([
    tf.keras.layers.Dense(100, activation = tf.keras.activations.relu),
    tf.keras.layers.Dense(10),
    tf.keras.layers.Dense(1)
])
model.compile(loss = tf.keras.losses.mae,            # mae stands for mean absolute error
              optimizer = tf.keras.optimizers.SGD(), # stochastic gradient descent
              metrics = ['mae'])
model.fit(X_train, y_train, epochs = 100)

How can I evaluate my TensorFlow model with the R2 score?

xgb.score returns the R2 score, and you can implement this metric from scratch in TensorFlow:

def R_squared(y, y_pred):
    residual = tf.reduce_sum(tf.square(tf.subtract(y, y_pred)))
    total = tf.reduce_sum(tf.square(tf.subtract(y, tf.reduce_mean(y))))
    r2 = tf.subtract(1.0, tf.divide(residual, total))  # tf.div was removed in TF 2.x; use tf.divide
    return r2

When compiling the model, pass this function in the metrics argument so that it is reported during both training and evaluation, like this:

model.compile(loss = tf.keras.losses.mae,            # mae stands for mean absolute error
              optimizer = tf.keras.optimizers.SGD(), # stochastic gradient descent
              metrics = ['mae', R_squared])
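
Depending on your TensorFlow version, a ready-made R2 metric may be available as well, so the hand-written function is optional. Here is a minimal sketch, assuming TensorFlow Addons is installed (tfa.metrics.RSquare); recent TF/Keras releases also ship tf.keras.metrics.R2Score:

import tensorflow_addons as tfa  # assumed installed: pip install tensorflow-addons

model.compile(loss = tf.keras.losses.mae,
              optimizer = tf.keras.optimizers.SGD(),
              metrics = ['mae', tfa.metrics.RSquare()])  # built-in R2 metric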

When you evaluate the model with model.evaluate, as shown below, the R2 score will be reported:

y_pred = model.predict(X_val)   # predictions, if you want them separately
model.evaluate(X_val, y_val)    # evaluate expects the inputs and the true targets
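
As a sanity check (a minimal sketch, assuming scikit-learn is available and reusing y_pred from above), the custom metric can be compared against sklearn.metrics.r2_score, which is exactly what xgb.score computes for a regressor:

from sklearn.metrics import r2_score

# R2 computed outside of Keras; this should closely match the
# R_squared value reported by model.evaluate.
print("sklearn R2 on validation data:", r2_score(y_val, y_pred.flatten()))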

You need to understand what mean absolute error measures; the value returned by that metric function is the score you track. Using it as the accuracy value means a single training step and a single evaluation step are enough. For reference, see the notes below on the score functions defined for XGBoost classifiers and regressors.
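
For reference, here is a minimal sketch with made-up numbers showing that the 'mae' metric reported by Keras is just the average absolute difference between the targets and the predictions:

import tensorflow as tf

y_true = tf.constant([3.0, 5.0, 2.5])
y_hat  = tf.constant([2.5, 5.0, 4.0])

manual_mae = tf.reduce_mean(tf.abs(y_true - y_hat))   # computed by hand
keras_mae  = tf.keras.losses.mae(y_true, y_hat)       # same value from Keras
print(manual_mae.numpy(), keras_mae.numpy())          # both 0.6666667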

[Conditions]:

# [ Score ] : score – Mean accuracy of self.predict(X) wrt. y.
# https://xgboost.readthedocs.io/en/stable/python/python_api.html?highlight=xgboost.score#xgboost.XGBClassifier.score
# Return the mean accuracy on the given test data and labels.
# https://xgboost.readthedocs.io/en/stable/python/python_api.html?highlight=xgboost.score#xgboost.XGBRFRegressor.score
# Coefficients are defined only for linear learners
# Coefficients are only defined when the linear model is chosen as base learner (booster=gblinear). 
# It is not defined for other base learner types, such as tree learners (booster=gbtree).
# https://xgboost.readthedocs.io/en/stable/search.html?q=xgboost.score&check_keywords=yes&area=default
# [py:method]: xgboost.XGBClassifier.score
#       Return the mean accuracy on the given test data and labels. ...
# [py:method]: xgboost.XGBRFClassifier.score
#       Return the mean accuracy on the given test data and labels. ...
# [py:method]: xgboost.XGBRFRegressor.score
#       Return the coefficient of determination of the prediction. Notes The (R^2) score used when calling ...
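
The notes above can be checked directly. A minimal sketch, assuming the fitted xgb model and the X_val/y_val arrays from the first snippet: for a regressor, .score is the coefficient of determination, i.e. the same value r2_score returns, not a classification accuracy.

from sklearn.metrics import r2_score

# Both lines should print the same number for XGBRegressor.
print(xgb.score(X_val, y_val))
print(r2_score(y_val, xgb.predict(X_val)))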

[TensorFlow expectations]:

# https://stackoverflow.com/questions/72123313/tensorflow-accuracy-score
# Tensorflow accuracy score

[Sample]:

import os
from os.path import exists
import tensorflow as tf
import tensorflow_io as tfio
import matplotlib.pyplot as plt
"""""""""""""""""""""""""""""""""""""""""""""""""""""""""
[PhysicalDevice(name='/physical_device:GPU:0', device_type='GPU')]
None
"""""""""""""""""""""""""""""""""""""""""""""""""""""""""
physical_devices = tf.config.experimental.list_physical_devices('GPU')
assert len(physical_devices) > 0, "Not enough GPU hardware devices available"
config = tf.config.experimental.set_memory_growth(physical_devices[0], True)
print(physical_devices)
print(config)
"""""""""""""""""""""""""""""""""""""""""""""""""""""""""
Variables
"""""""""""""""""""""""""""""""""""""""""""""""""""""""""
PATH = os.path.join('F:\\datasets\\downloads\\Actors\\train\\Pikaploy', '*.tif')
PATH_2 = os.path.join('F:\\datasets\\downloads\\Actors\\train\\Candidt Kibt', '*.tif')
files = tf.data.Dataset.list_files(PATH)
files_2 = tf.data.Dataset.list_files(PATH_2)
list_file = []
list_file_actual = []
list_label = []
list_label_actual = [ 'Pikaploy', 'Pikaploy', 'Pikaploy', 'Pikaploy', 'Pikaploy', 'Candidt Kibt', 'Candidt Kibt', 'Candidt Kibt', 'Candidt Kibt', 'Candidt Kibt' ]
for file in files.take(5):
    image = tf.io.read_file( file )
    image = tfio.experimental.image.decode_tiff(image, index=0)
    list_file_actual.append(image)
    image = tf.image.resize(image, [32,32], method='nearest')
    list_file.append(image)
    list_label.append(1)

for file in files_2.take(5):
    image = tf.io.read_file( file )
    image = tfio.experimental.image.decode_tiff(image, index=0)
    list_file_actual.append(image)
    image = tf.image.resize(image, [32,32], method='nearest')
    list_file.append(image)
    list_label.append(9)
checkpoint_path = "F:\\models\\checkpoint\\" + os.path.basename(__file__).split('.')[0] + "\\TF_DataSets_01.h5"
checkpoint_dir = os.path.dirname(checkpoint_path)
loggings = "F:\\models\\checkpoint\\" + os.path.basename(__file__).split('.')[0] + "\\loggings.log"
if not exists(checkpoint_dir) :
    os.mkdir(checkpoint_dir)
    print("Create directory: " + checkpoint_dir)

log_dir = checkpoint_dir
"""""""""""""""""""""""""""""""""""""""""""""""""""""""""
DataSet
"""""""""""""""""""""""""""""""""""""""""""""""""""""""""
dataset = tf.data.Dataset.from_tensor_slices(
    (tf.constant(tf.cast(list_file, dtype=tf.int64), shape=(10, 1, 32, 32, 4), dtype=tf.int64),
     tf.constant(list_label, shape=(10, 1, 1), dtype=tf.int64)))
"""""""""""""""""""""""""""""""""""""""""""""""""""""""""
: Callback
"""""""""""""""""""""""""""""""""""""""""""""""""""""""""
class custom_callback(tf.keras.callbacks.Callback):

    def on_train_end(self, logs=None):
        print( "\ntrain mae: " + str( logs['mae'] ) )

    def on_test_end(self, logs=None):
        print( "\nevaluation mae: " + str( logs['mae'] ) )

custom_callback = custom_callback()
"""""""""""""""""""""""""""""""""""""""""""""""""""""""""
: Model Initialize
"""""""""""""""""""""""""""""""""""""""""""""""""""""""""
model = tf.keras.models.Sequential([
    tf.keras.layers.InputLayer(input_shape=( 32, 32, 4 )),
    tf.keras.layers.Normalization(mean=3., variance=2.),
    tf.keras.layers.Normalization(mean=4., variance=6.),
    tf.keras.layers.Conv2D(32, (3, 3), activation='relu'),
    tf.keras.layers.MaxPooling2D((2, 2)),
    tf.keras.layers.Dense(128, activation='relu'),
    tf.keras.layers.Reshape((128, 225)),
    tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(96, return_sequences=True, return_state=False)),
    tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(96)),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(192, activation='relu'),
    tf.keras.layers.Dense(10),
])
"""""""""""""""""""""""""""""""""""""""""""""""""""""""""
: Optimizer
"""""""""""""""""""""""""""""""""""""""""""""""""""""""""
optimizer = tf.keras.optimizers.SGD( learning_rate=0.01, momentum=0.1, )
"""""""""""""""""""""""""""""""""""""""""""""""""""""""""
: Loss Fn
"""""""""""""""""""""""""""""""""""""""""""""""""""""""""                               
lossfn = tf.keras.losses.mae
"""""""""""""""""""""""""""""""""""""""""""""""""""""""""
: Model Summary
"""""""""""""""""""""""""""""""""""""""""""""""""""""""""
model.compile(optimizer=optimizer, loss=lossfn, metrics=['mae'])
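# Note: an R2 metric could also be appended here, e.g.
# metrics=['mae', R_squared] using the function defined earlier,
# so that fit/evaluate report it alongside the MAE.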
"""""""""""""""""""""""""""""""""""""""""""""""""""""""""
: FileWriter
"""""""""""""""""""""""""""""""""""""""""""""""""""""""""
if exists(checkpoint_path) :
    model.load_weights(checkpoint_path)
    print("model load: " + checkpoint_path)
    input("Press Any Key!")
"""""""""""""""""""""""""""""""""""""""""""""""""""""""""
: Training
"""""""""""""""""""""""""""""""""""""""""""""""""""""""""
history = model.fit( dataset, validation_data=dataset, validation_steps=1, epochs=50, callbacks=[custom_callback] )  # batch_size is omitted: Keras rejects it for Dataset inputs
model.save_weights(checkpoint_path)
# [ Score ] : score – Mean accuracy of self.predict(X) wrt. y.
# https://xgboost.readthedocs.io/en/stable/python/python_api.html?highlight=xgboost.score#xgboost.XGBClassifier.score
# Return the mean accuracy on the given test data and labels.

result = model.evaluate(
    dataset, verbose=0, callbacks=[custom_callback]  # batch_size omitted for the same reason as in fit
)
plt.figure(figsize=(5,2))
plt.title("Actors recognitions")
for i in range(len(list_file)):
    img = tf.keras.preprocessing.image.array_to_img(
        list_file[i],
        data_format=None,
        scale=True
    )
    img_array = tf.keras.preprocessing.image.img_to_array(img)
    img_array = tf.expand_dims(img_array, 0)
    predictions = model.predict(img_array, callbacks=[custom_callback], verbose=1)

    score = tf.nn.softmax(predictions[0])
    plt.subplot(5, 2, i + 1)
    plt.xticks([])
    plt.yticks([])
    plt.grid(False)
    plt.imshow(list_file_actual[i])
    plt.xlabel(str(round(score[tf.math.argmax(score).numpy()].numpy(), 2)) + ":" + str(list_label_actual[tf.math.argmax(score)]))

plt.show()
input('...')

[Output]: MAE is used here as the accuracy value.

train mae: 3.8480560779571533
evaluation mae: 3.8665459156036377
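
The list returned by model.evaluate lines up with model.metrics_names, so the scores can also be printed by name. A minimal sketch, reusing result from the example above; adding an R2 metric to metrics=[...] would simply append another entry:

# result is [loss, mae], in the order given by model.metrics_names.
for name, value in zip(model.metrics_names, result):
    print(name, ":", value)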

[Result]: Sample
