I'm trying to use SGDClassifier from scikit-learn 0.15.1. Apart from the number of iterations, there seems to be no way to set a convergence criterion. So I want to do this manually: check the error after each iteration, then run additional iterations until the improvement is small enough.
Unfortunately, neither the warm_start flag nor coef_init/intercept_init seems to actually warm-start the optimization; both appear to start from scratch.
What should I do? Without a real convergence criterion or a working warm start, the classifier is unusable for me.
Note below how the bias jumps considerably at each restart, and how the loss also jumps up and then decreases again with further iterations. After 250 iterations the bias is -3.44 and the average loss is 1.46.
from sklearn.linear_model import SGDClassifier

sgd = SGDClassifier(loss='log', alpha=alpha, verbose=1, shuffle=True,
                    warm_start=True)

print('INITIAL FIT')
sgd.fit(X, y, sample_weight=sample_weight)

# request a single additional epoch and refit
sgd.n_iter = 1
print('\nONE MORE ITERATION')
sgd.fit(X, y, sample_weight=sample_weight)

# request three additional epochs and refit
sgd.n_iter = 3
print('\nTHREE MORE ITERATIONS')
sgd.fit(X, y, sample_weight=sample_weight)
INITIAL FIT
-- Epoch 1
Norm: 254.11, NNZs: 92299, Bias: -5.239955, T: 122956, Avg. loss: 28.103236
Total training time: 0.04 seconds.
-- Epoch 2
Norm: 138.81, NNZs: 92598, Bias: -5.180938, T: 245912, Avg. loss: 16.420537
Total training time: 0.08 seconds.
-- Epoch 3
Norm: 100.61, NNZs: 92598, Bias: -5.082776, T: 368868, Avg. loss: 12.240537
Total training time: 0.12 seconds.
-- Epoch 4
Norm: 74.18, NNZs: 92598, Bias: -5.076395, T: 491824, Avg. loss: 9.859404
Total training time: 0.17 seconds.
-- Epoch 5
Norm: 55.57, NNZs: 92598, Bias: -5.072369, T: 614780, Avg. loss: 8.280854
Total training time: 0.21 seconds.
ONE MORE ITERATION
-- Epoch 1
Norm: 243.07, NNZs: 92598, Bias: -11.271497, T: 122956, Avg. loss: 26.148746
Total training time: 0.04 seconds.
THREE MORE ITERATIONS
-- Epoch 1
Norm: 258.70, NNZs: 92598, Bias: -16.058395, T: 122956, Avg. loss: 29.666688
Total training time: 0.04 seconds.
-- Epoch 2
Norm: 142.24, NNZs: 92598, Bias: -15.809559, T: 245912, Avg. loss: 17.435114
Total training time: 0.08 seconds.
-- Epoch 3
Norm: 102.71, NNZs: 92598, Bias: -15.715853, T: 368868, Avg. loss: 12.731181
Total training time: 0.12 seconds.
warm_start=True will reuse the fitted coefficients as the starting point, but it restarts the learning-rate schedule. If you want to check convergence manually, I suggest you use partial_fit rather than fit as @AdrienNK suggested:
import numpy as np

sgd = SGDClassifier(loss='log', alpha=alpha, verbose=1, shuffle=True,
                    warm_start=True, n_iter=1)
# classes is required on the first call to partial_fit;
# each call performs exactly one epoch over the data
sgd.partial_fit(X, y, classes=np.unique(y))
# after 1st iteration
sgd.partial_fit(X, y)
# after 2nd iteration
...
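
Putting it together, a manual convergence check could look like the sketch below. This is only an illustration, not part of the original answer: the tolerance tol, the epoch cap max_epochs, and the use of log_loss on the training set as the monitored error are assumptions you should adapt to your own data.

import numpy as np
from sklearn.linear_model import SGDClassifier
from sklearn.metrics import log_loss

# Sketch of a manual stopping rule built on partial_fit (tol, max_epochs
# and the monitored metric are assumptions, not part of the original answer).
sgd = SGDClassifier(loss='log', alpha=alpha, shuffle=True)

tol = 1e-4          # stop when the loss improves by less than this
max_epochs = 250    # safety cap on the number of passes over the data
prev_loss = np.inf

for epoch in range(max_epochs):
    # classes must be supplied on the first partial_fit call
    sgd.partial_fit(X, y, classes=np.unique(y), sample_weight=sample_weight)
    loss = log_loss(y, sgd.predict_proba(X))
    if prev_loss - loss < tol:
        print('converged after %d epochs' % (epoch + 1))
        break
    prev_loss = loss

Each partial_fit call runs a single pass over X, so the loop gives you full control over when to stop.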