GridSearchCV: scoring does not use the chosen XGBRegressor eval_metric or scoring method



Scikit-learn's GridSearchCV is used for hyperparameter tuning of an XGBRegressor model. GridSearchCV produces the same score regardless of the eval_metric specified in XGBRegressor().fit(). The documentation at https://scikit-learn.org/stable/modules/generated/sklearn.model_selection.GridSearchCV.html says about the scoring parameter: "If None, the estimator's score method is used." That does not seem to happen; I always get the same value. How do I get results that correspond to the XGBRegressor eval_metric?

Example code:

import numpy as np
from sklearn.model_selection import GridSearchCV, KFold
from sklearn.datasets import load_boston
import xgboost as xgb
rng = np.random.RandomState(31337)
boston = load_boston()
y = boston['target']
X = boston['data']
kf = KFold(n_splits=2)  # random_state has no effect without shuffle=True and raises in recent scikit-learn
folds = list(kf.split(X))
xgb_model = xgb.XGBRegressor(objective='reg:squarederror', verbose=False)
reg = GridSearchCV(estimator=xgb_model,
                   param_grid={'max_depth': [2], 'n_estimators': [50]},
                   cv=folds,
                   verbose=False)
reg.fit(X, y, **{'eval_metric': 'mae', 'verbose': False})
print('GridSearchCV mean(mae)?:  ', reg.cv_results_['mean_test_score'])
# -----------------------------------------------
reg.fit(X, y, **{'eval_metric': 'rmse', 'verbose': False})
print('GridSearchCV mean(rmse)?: ', reg.cv_results_['mean_test_score'])
print("----------------------------------------------------")
xgb_model.set_params(**{'max_depth': 2, 'n_estimators': 50})
xgb_model.fit(X[folds[0][0], :], y[folds[0][0]], eval_metric='mae',
              eval_set=[(X[folds[0][0], :], y[folds[0][0]])], verbose=False)
print('XGBRegressor 0-mae:', xgb_model.evals_result()['validation_0']['mae'][-1])
xgb_model.fit(X[folds[0][1], :], y[folds[0][1]], eval_metric='mae',
              eval_set=[(X[folds[0][1], :], y[folds[0][1]])], verbose=False)
print('XGBRegressor 1-mae:', xgb_model.evals_result()['validation_0']['mae'][-1])
xgb_model.fit(X[folds[0][0], :], y[folds[0][0]], eval_metric='rmse',
              eval_set=[(X[folds[0][0], :], y[folds[0][0]])], verbose=False)
print('XGBRegressor 0-rmse:', xgb_model.evals_result()['validation_0']['rmse'][-1])
xgb_model.fit(X[folds[0][1], :], y[folds[0][1]], eval_metric='rmse',
              eval_set=[(X[folds[0][1], :], y[folds[0][1]])], verbose=False)
print('XGBRegressor 1-rmse:', xgb_model.evals_result()['validation_0']['rmse'][-1])

It returns (I expected the numbers above the line to be the mean of the numbers below it):

GridSearchCV mean(mae)?:   [0.70941007]
GridSearchCV mean(rmse)?:  [0.70941007]
----------------------------------------------------
XGBRegressor 0-mae: 1.273626
XGBRegressor 1-mae: 1.004947
XGBRegressor 0-rmse: 1.647694
XGBRegressor 1-rmse: 1.290872
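The effect in the question can be reproduced without XGBoost at all: with scoring=None, GridSearchCV falls back to the estimator's own score method and its mean_test_score is just the mean of .score() over the validation folds. A minimal sketch, using scikit-learn's DecisionTreeRegressor and a synthetic dataset as stand-ins (the fallback mechanism is the same for any regressor):

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.metrics import r2_score
from sklearn.model_selection import GridSearchCV, KFold
from sklearn.tree import DecisionTreeRegressor

X, y = make_regression(n_samples=200, n_features=5, noise=10.0, random_state=0)
kf = KFold(n_splits=2)
folds = list(kf.split(X))

# scoring=None -> GridSearchCV uses the estimator's score method (R^2 for regressors)
grid = GridSearchCV(DecisionTreeRegressor(random_state=0),
                    param_grid={'max_depth': [2]}, cv=folds, scoring=None)
grid.fit(X, y)

# Recompute by hand: fit on each training fold, take R^2 on the validation fold
manual = []
for train_idx, test_idx in folds:
    est = DecisionTreeRegressor(max_depth=2, random_state=0)
    est.fit(X[train_idx], y[train_idx])
    manual.append(r2_score(y[test_idx], est.predict(X[test_idx])))

print(grid.cv_results_['mean_test_score'][0], np.mean(manual))  # the two match
```

Note also that the per-fold numbers in the question evaluate each model on its own training data (eval_set equals the training set), whereas GridSearchCV scores on the held-out fold, so they would differ even with matching metrics.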

TL;DR: what you are getting is the so-called R2, or coefficient of determination. It is the default scoring metric of XGBRegressor's score function, which GridSearchCV picks up when scoring=None.

Compare the result with scoring coded explicitly:

from sklearn.metrics import make_scorer, r2_score, mean_squared_error
xgb_model = xgb.XGBRegressor(objective='reg:squarederror', verbose=False)
reg = GridSearchCV(estimator=xgb_model,
                   scoring=make_scorer(r2_score),
                   param_grid={'max_depth': [2], 'n_estimators': [50]},
                   cv=folds,
                   verbose=False)
reg.fit(X, y)
reg.best_score_
0.7333542105472226

And with scoring=None:

reg = GridSearchCV(estimator=xgb_model, scoring=None,
                   param_grid={'max_depth': [2], 'n_estimators': [50]},
                   cv=folds,
                   verbose=False)
reg.fit(X, y)
reg.best_score_
0.7333542105472226

If you read the GridSearchCV docstring:

estimator : estimator object. This is assumed to implement the scikit-learn estimator interface. Either estimator needs to provide a ``score`` function, or ``scoring`` must be passed.

At that point you check the docs for xgb_model.score?:

Signature: xgb_model.score(X, y, sample_weight=None)
Docstring:
Return the coefficient of determination R^2 of the prediction.
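That docstring comes from scikit-learn's RegressorMixin, which computes r2_score on the model's predictions. A quick check, using LinearRegression as a stand-in (XGBRegressor inherits the same contract through the scikit-learn wrapper):

```python
from sklearn.datasets import make_regression
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score

X, y = make_regression(n_samples=100, n_features=3, noise=5.0, random_state=1)
model = LinearRegression().fit(X, y)

# .score() is exactly r2_score applied to the model's own predictions
print(model.score(X, y) == r2_score(y, model.predict(X)))  # True
```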

So, with the help of these docs: if you don't like XGBRegressor's default R2 score function, supply your scoring function to GridSearchCV explicitly.

For example, if you want RMSE you could do:

reg = GridSearchCV(estimator=xgb_model,
                   scoring=make_scorer(mean_squared_error, squared=False),
                   param_grid={'max_depth': [2], 'n_estimators': [50]},
                   cv=folds,
                   verbose=False)
reg.fit(X, y)
reg.best_score_
4.618242594168436

(Note that make_scorer defaults to greater_is_better=True; with more than one parameter combination you would want greater_is_better=False so that GridSearchCV minimizes RMSE instead of maximizing it. The reported scores are then negated.)
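Alternatively, scoring accepts one of scikit-learn's predefined metric strings, which already have the sign convention handled. A sketch matching eval_metric='mae', again with DecisionTreeRegressor and synthetic data as stand-ins:

```python
from sklearn.datasets import make_regression
from sklearn.model_selection import GridSearchCV
from sklearn.tree import DecisionTreeRegressor

X, y = make_regression(n_samples=200, n_features=5, noise=10.0, random_state=0)

# 'neg_mean_absolute_error' is the built-in counterpart of eval_metric='mae';
# scores are negated so that "greater is better" holds for model selection.
grid = GridSearchCV(DecisionTreeRegressor(random_state=0),
                    param_grid={'max_depth': [2, 4]},
                    cv=2,
                    scoring='neg_mean_absolute_error')
grid.fit(X, y)
print(grid.best_score_)  # negative MAE: closer to zero is better
```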
