> I am trying to plot the error of a gradient boosting classification, but I cannot seem to find my mistake. I looked for similar topics on this site but did not find a satisfactory answer. Here is my code, and I hope you can help:
import time

import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

# One slot per value of n_estimators tried below (5..149)
tableau_duree_grd = np.zeros(145)
tableau_erreur_grd = np.zeros(145)

for b in range(5, 150):
    start_time = time.time()
    grd = GradientBoostingClassifier(n_estimators=b, validation_fraction=0.1,
                                     n_iter_no_change=10, learning_rate=0.1,
                                     max_features=None)
    grd.fit(XTrainD, YTrainD)
    pred = grd.predict(XTestD)
    test_erreur_grd = np.mean(YTestD != pred)    # misclassification rate on the test set
    end_time = time.time()
    duree = end_time - start_time
    tableau_duree_grd[b - 5] = duree             # fit + predict time in seconds
    tableau_erreur_grd[b - 5] = test_erreur_grd
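For reference, the plotting step itself (which the loop above never reaches because of the error below) could be a minimal matplotlib sketch along these lines; the axis labels and the use of pyplot are my own assumptions:

import matplotlib.pyplot as plt

estimators = np.arange(5, 150)            # the n_estimators values tried in the loop
plt.plot(estimators, tableau_erreur_grd)  # test error for each number of estimators
plt.xlabel("n_estimators")
plt.ylabel("test error")
plt.show()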
Full error traceback:
IndexError Traceback (most recent call last)
<ipython-input-46-9978ac0dd8ba> in <module>
6 start_time=time.time()
7 grd=GradientBoostingClassifier(n_estimators=b,validation_fraction= 0.1, n_iter_no_change=10,learning_rate=0.1,max_features=None)
----> 8 grd.fit(XTrainD,YTrainD)
9 pred = grd.predict(XTestD)
10 test_erreur_grd = np.mean(YTestD!=pred)
~\AppData\Local\Continuum\anaconda3\lib\site-packages\sklearn\ensemble\gradient_boosting.py in fit(self, X, y, sample_weight, monitor)
1463 n_stages = self._fit_stages(X, y, y_pred, sample_weight, self._rng,
1464 X_val, y_val, sample_weight_val,
-> 1465 begin_at_stage, monitor, X_idx_sorted)
1466
1467 # change shape of arrays after fit (early-stopping or additional ests)
~\AppData\Local\Continuum\anaconda3\lib\site-packages\sklearn\ensemble\gradient_boosting.py in _fit_stages(self, X, y, y_pred, sample_weight, random_state, X_val, y_val, sample_weight_val, begin_at_stage, monitor, X_idx_sorted)
1527 y_pred = self._fit_stage(i, X, y, y_pred, sample_weight,
1528 sample_mask, random_state, X_idx_sorted,
-> 1529 X_csc, X_csr)
1530
1531 # track deviance (= loss)
~\AppData\Local\Continuum\anaconda3\lib\site-packages\sklearn\ensemble\gradient_boosting.py in _fit_stage(self, i, X, y, y_pred, sample_weight, sample_mask, random_state, X_idx_sorted, X_csc, X_csr)
1169
1170 residual = loss.negative_gradient(y, y_pred, k=k,
-> 1171 sample_weight=sample_weight)
1172
1173 # induce regression tree on residuals
~\AppData\Local\Continuum\anaconda3\lib\site-packages\sklearn\ensemble\gradient_boosting.py in negative_gradient(self, y, pred, k, **kwargs)
914 The index of the class
915 """
--> 916 return y - np.nan_to_num(np.exp(pred[:, k] -
917 logsumexp(pred, axis=1)))
918
IndexError: index 12 is out of bounds for axis 1 with size 12
I don't know this software, but this sounds like a bug. Programming languages usually count indices starting from 0 rather than from 1.
More specifically, at line 916 of gradient_boosting.py, the variable k probably holds the value 12 where the value 11 should have been used.
You can also get more information by adding print statements around line 916 of gradient_boosting.py, to get a better picture of what is happening in the logic at the moment the error occurs.
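As a concrete sketch of that last suggestion, you could temporarily edit your local copy of sklearn/ensemble/gradient_boosting.py, just before the return shown in the traceback (this is a debugging-only change of mine, not part of the library; remove it once you have the output):

    # ... inside negative_gradient(), just before the return around line 916:
    # np and logsumexp are already imported in this module, per the traceback.
    print("k =", k, "pred.shape =", pred.shape, "classes in y =", np.unique(y))  # debug
    return y - np.nan_to_num(np.exp(pred[:, k] -
                                    logsumexp(pred, axis=1)))

Comparing the printed pred.shape with the value of k and with the classes actually present in y should show whether the off-by-one comes from the labels or from the predictions at the moment the IndexError is raised.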