I'm using XGBoost for sales forecasting and need a custom objective function, because the value of a prediction depends on the item's sales price. I'm struggling to feed the sales price into the loss function alongside the labels and predictions. Here is my approach:
def monetary_value_objective(predt: np.ndarray, dtrain: Union[xgb.DMatrix, np.ndarray]) -> Tuple[np.ndarray, np.ndarray]:
    """
    predt = model prediction
    dtrain = labels
    Currently, dtrain is a numpy array.
    """
    y = dtrain
    mask1 = predt <= y  # Predicted too little
    mask2 = predt > y   # Predicted too much
    price = train[0]["salesPrice"]
    grad = price ** 2 * (predt - y)
    # Gradient is negative if the prediction is too low, positive if it is too high
    # Scale it here (0.72 = 0.6**2 * 2)
    grad[mask1] = 2 * grad[mask1]
    grad[mask2] = 0.72 * grad[mask2]
    hess = np.empty_like(grad)
    hess[mask1] = 2 * price[mask1] ** 2
    hess[mask2] = 0.72 * price[mask2] ** 2
    grad = -grad
    return grad, hess
During hyperparameter tuning I get the following error:
[09:11:35] WARNING: /workspace/src/objective/regression_obj.cu:152: reg:linear is now deprecated in favor of reg:squarederror.
0%| | 0/1 [00:00<?, ?it/s, best loss: ?]
---------------------------------------------------------------------------
IndexError Traceback (most recent call last)
<ipython-input-34-2c64dc1b5a76> in <module>()
1 # set runtime environment to GPU at: Runtime -> Change runtime type
----> 2 trials, best_hyperparams = hyperpara_tuning(para_space)
3 final_xgb_model = trials.best_trial['result']['model']
4 assert final_xgb_model is not None, "Oooops there is no model created :O "
5
17 frames
/usr/local/lib/python3.6/dist-packages/pandas/core/indexers.py in check_array_indexer(array, indexer)
399 if len(indexer) != len(array):
400 raise IndexError(
--> 401 f"Boolean index has wrong length: "
402 f"{len(indexer)} instead of {len(array)}"
403 )
IndexError: Boolean index has wrong length: 1 instead of 136019
Does anyone know how to use the sales price inside the objective function? Is this even possible?
Thanks!
A bit late, but this answers the OP: https://datascience.stackexchange.com/questions/74780/how-to-implement-custom-loss-function-that-has-more-parameters-with-xgbclassifie
You can use a function that returns a function: the inner function keeps the callback signature XGBoost expects, but can use data from the enclosing function.
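A minimal Python sketch of this pattern, assuming the sales prices are available as a NumPy array aligned with the training rows (`make_monetary_objective` is a hypothetical name; the inner function mirrors the OP's gradient/Hessian, but without the final sign flip, since XGBoost expects the gradient of the loss to be minimized):

```python
import numpy as np

def make_monetary_objective(price):
    """Return an objective with XGBoost's (predt, dtrain) signature,
    capturing the price vector from the enclosing scope."""
    def monetary_value_objective(predt, dtrain):
        # Works with both a DMatrix and a plain label array, as in the OP's code
        y = dtrain.get_label() if hasattr(dtrain, "get_label") else dtrain
        too_low = predt <= y   # predicted too little
        too_high = predt > y   # predicted too much
        grad = price ** 2 * (predt - y)
        grad[too_low] *= 2
        grad[too_high] *= 0.72  # 0.72 = 0.6**2 * 2
        hess = np.empty_like(grad)
        hess[too_low] = 2 * price[too_low] ** 2
        hess[too_high] = 0.72 * price[too_high] ** 2
        return grad, hess
    return monetary_value_objective
```

With the native API you would then train via something like `xgb.train(params, dtrain, num_boost_round=100, obj=make_monetary_objective(prices))`. The key point is that `prices` must have exactly one entry per row of `dtrain` (slice it the same way you slice the training data); the "Boolean index has wrong length: 1 instead of 136019" error above suggests the price lookup returned the wrong shape.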
You can use the weights vector in a custom objective function. Encoding the external variable into the weight distribution can work, but I don't know whether the weights are used only by the objective itself or also at the data-sampling level; if the latter, you end up in a more complicated situation...
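A sketch of that idea, assuming you store the squared prices as the DMatrix weights (e.g. `xgb.DMatrix(X, label=y, weight=price ** 2)`; `weighted_value_objective` is a hypothetical name). The objective can then read them back with `get_weight()`:

```python
import numpy as np

def weighted_value_objective(predt, dtrain):
    """Weighted squared error: the per-row scale comes from the weight
    vector stored in the DMatrix (here, the squared sales prices)."""
    y = dtrain.get_label()
    w = dtrain.get_weight()   # squared sales prices, one per row
    grad = w * (predt - y)    # gradient of 0.5 * w * (predt - y)**2
    hess = w.copy()           # second derivative is just the weight
    return grad, hess
```

As noted above, weights set on a DMatrix are also seen by built-in evaluation metrics, so your eval metric becomes price-weighted too; check whether that is acceptable before relying on this route.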
You can use a closure to pass the values you need from the environment while keeping the required objective-function signature. Here is an R example:
# Define the closure
objectiveShell <- function(original.d) {
  myobjective <- function(preds, dtrain) {
    extradata <- original.d$some_additional_data
    labels <- getinfo(dtrain, "label")
    grad <- (preds - labels) + extradata
    hess <- rep(1, length(labels))
    return(list(grad = grad, hess = hess))
  }
  return(myobjective)
}
# Model parameters
param1 <- list(booster = 'gbtree'
               , learning_rate = 0.1
               # This is how you use the closure
               , objective = objectiveShell(DESIRED_DATA_FRAME)
               , eval_metric = evalerror  # custom evaluation function defined elsewhere
               , seed = 2020)
# Train the model
xgb1 <- xgb.train(params = param1
                  , data = dtrain
                  , nrounds = 5
                  , watchlist = watchlist
                  , maximize = FALSE
                  , early_stopping_rounds = 5)
A Python example can be found here: https://datascience.stackexchange.com/questions/74780/how-to-implement-custom-loss-function-that-has-more-parameters-with-xgbclassifie