The XGBoost documentation shows that a model trained through the native API can be sliced with the following code:
from sklearn.datasets import make_classification
import xgboost as xgb

# Toy data and round count (filled in here so the snippet runs)
X, y = make_classification(n_classes=3, n_informative=4)
dtrain = xgb.DMatrix(X, label=y)
num_boost_round = 10

booster = xgb.train(
    {'num_parallel_tree': 4, 'subsample': 0.5, 'num_class': 3},
    num_boost_round=num_boost_round, dtrain=dtrain)
sliced: xgb.Booster = booster[3:7]
I tried it, and it works.
Since XGBoost also provides a Scikit-Learn wrapper interface, I tried the same thing there:
from xgboost import XGBClassifier
clf_xgb = XGBClassifier().fit(X_train, y_train)
clf_xgb_sliced: clf_xgb.Booster = booster[3:7]
but got the following error:
---------------------------------------------------------------------------
AttributeError Traceback (most recent call last)
<ipython-input-18-84155815d877> in <module>
----> 1 clf_xgb_sliced: clf_xgb.Booster = booster[3:7]
AttributeError: 'XGBClassifier' object has no attribute 'Booster'
Since XGBClassifier has no attribute 'Booster', is there any way to slice a model trained through the Scikit-Learn wrapper (XGBClassifier/XGBRegressor)?
The problem is that the annotation clf_xgb.Booster does not name an existing attribute: the wrapper stores the native Booster internally and exposes it through the get_booster() method. Try:

clf_xgb_sliced: xgb.Booster = clf_xgb.get_booster()[3:7]