Cross-validation with separately split train and test datasets



Unlike the standard setup, my dataset comes pre-split into train, test1 and test2. I implemented an ML algorithm and obtained performance metrics. But when I try to apply cross-validation, it gets complicated. Maybe someone can help me... Thanks.

Here is my code.

import pandas as pd

train = pd.read_csv('train-alldata.csv', sep=";")
test = pd.read_csv('test1-alldata.csv', sep=";")
test2 = pd.read_csv('test2-alldata.csv', sep=";")

# train_pca_son, test_pca_son and test2_pca_son are PCA-transformed
# versions of the frames above (transformation code omitted)
X_train = train_pca_son.drop('churn_yn', axis=1)
y_train = train_pca_son['churn_yn']
X_test = test_pca_son.drop('churn_yn', axis=1)
y_test = test_pca_son['churn_yn']
X_test_2 = test2_pca_son.drop('churn_yn', axis=1)
y_test_2 = test2_pca_son['churn_yn']

For example, a KNN classifier:

from sklearn.neighbors import KNeighborsClassifier

knn_classifier = KNeighborsClassifier(n_neighbors=7, metric='euclidean')
knn_classifier.fit(X_train, y_train)

And the K-Fold part:

from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import KFold, cross_val_score

dtc = DecisionTreeClassifier(random_state=42)
k_folds = KFold(n_splits=5)

# X and y are not defined for my pre-split data -- this is where it gets complicated
scores = cross_val_score(dtc, X, y, cv=k_folds)
print("Cross Validation Scores: ", scores)
print("Average CV Score: ", scores.mean())
print("Number of CV Scores used in Average: ", len(scores))

This is the "holdout test data" pattern (see Wikipedia: Training, validation, and test data sets, where the terminology is frequently confused). For churn prediction it can arise naturally, for example if you have two types of customers, or want to evaluate on two separate time frames.

X_train, y_train    ← perform training and hyperparameter tuning with this
X_test1, y_test1    ← test on this
X_test2, y_test2    ← test on this as well
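
Cross-validation itself belongs on the training portion only; the two test sets stay untouched until the end. As a minimal sketch using the variable names from the question (the classifier and fold count are arbitrary choices here):

from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

# 5-fold CV on the training data only; no KFold object is needed for a plain split
knn_classifier = KNeighborsClassifier(n_neighbors=7, metric='euclidean')
cv_scores = cross_val_score(knn_classifier, X_train, y_train, cv=5)
print("Cross Validation Scores: ", cv_scores)
print("Average CV Score: ", cv_scores.mean())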

Cross-validation on the training data estimates the holdout error; it is also what runs internally if you tune hyperparameters with GridSearchCV. The final evaluation then consists of measuring performance on the two test sets, either separately or averaged over both:

from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import f1_score

# simulate the three-way split: 60% train, 20% test1, 20% test2
X, y = make_classification(n_samples=1000, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=.4)
X_test1, X_test2, y_test1, y_test2 = train_test_split(X_test, y_test, test_size=.5)
print(y_train.shape, y_test1.shape, y_test2.shape)
# (600,) (200,) (200,)

# train once, then evaluate on each holdout set separately
# (the splits above are unseeded, so exact scores will vary)
clf = KNeighborsClassifier(n_neighbors=7).fit(X_train, y_train)
print(f1_score(y_test1, clf.predict(X_test1)))
print(f1_score(y_test2, clf.predict(X_test2)))
# 0.819
# 0.805
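
If hyperparameters such as n_neighbors should be tuned as well, that tuning can run as cross-validation inside the training set, for example with GridSearchCV. A sketch continuing from the code above (the parameter grid is a hypothetical choice, adjust it to your problem):

from sklearn.model_selection import GridSearchCV

# tune n_neighbors via 5-fold CV on the training data only
param_grid = {'n_neighbors': [3, 5, 7, 9]}  # hypothetical search grid
search = GridSearchCV(KNeighborsClassifier(), param_grid, cv=5)
search.fit(X_train, y_train)
print(search.best_params_)

# final, unbiased evaluation of the tuned model on both test sets
best_clf = search.best_estimator_
print(f1_score(y_test1, best_clf.predict(X_test1)))
print(f1_score(y_test2, best_clf.predict(X_test2)))

Averaging the two test scores would give a single summary figure; reporting them separately shows whether the model generalises equally well to both test sets.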
