Multiple/Multivariate Linear Regression in Scikit Learn



I have a dataset in .csv files (dataTrain.csv and dataTest.csv) with the following format:

Temperature(K),Pressure(ATM),CompressibilityFactor(Z)
273.1,24.675,0.806677258
313.1,24.675,0.888394713
...,...,...

and I am able to build a regression model and make predictions with the following code:

import pandas as pd
from sklearn import linear_model
dataTrain = pd.read_csv("dataTrain.csv")
dataTest = pd.read_csv("dataTest.csv")
# print(dataTrain.head())
# single feature: Temperature(K) as a 2-D column vector
x_train = dataTrain['Temperature(K)'].to_numpy().reshape(-1, 1)
y_train = dataTrain['CompressibilityFactor(Z)']
x_test = dataTest['Temperature(K)'].to_numpy().reshape(-1, 1)
y_test = dataTest['CompressibilityFactor(Z)']
ols = linear_model.LinearRegression()
model = ols.fit(x_train, y_train)
print(model.predict(x_test)[0:5])

However, what I want to do is a multivariate regression, so the model would be CompressibilityFactor(Z) = intercept + coef1*Temperature(K) + coef2*Pressure(ATM).

How can I do that in scikit-learn?

If your code above works for the single-variable case, try this:

import pandas as pd
from sklearn import linear_model
dataTrain = pd.read_csv("dataTrain.csv")
dataTest = pd.read_csv("dataTest.csv")
# print(dataTrain.head())
# two features: Temperature(K) and Pressure(ATM) as an (n_samples, 2) array
x_train = dataTrain[['Temperature(K)', 'Pressure(ATM)']].to_numpy().reshape(-1, 2)
y_train = dataTrain['CompressibilityFactor(Z)']
x_test = dataTest[['Temperature(K)', 'Pressure(ATM)']].to_numpy().reshape(-1, 2)
y_test = dataTest['CompressibilityFactor(Z)']
ols = linear_model.LinearRegression()
model = ols.fit(x_train, y_train)
print(model.predict(x_test)[0:5])
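
If you also want a quick sense of how well the two-feature fit generalizes, here is a minimal sketch (not part of the original code) that scores the model on the held-out test set, reusing the y_test loaded above:

from sklearn.metrics import mean_squared_error
# R^2 of the fitted model on the held-out test set
print(model.score(x_test, y_test))
# mean squared error between predicted and actual Z values
print(mean_squared_error(y_test, model.predict(x_test)))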

That's right, you need to pass both feature columns as a 2-D array, e.g. with .to_numpy().reshape(-1,2) (the older .values.reshape(-1,2) works as well).
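
As a quick sanity check (just a sketch, reusing the dataTrain DataFrame loaded above; the variable name features is only illustrative), selecting the two feature columns already yields the (n_samples, 2) array that LinearRegression.fit expects:

# selecting two columns gives a 2-D array of shape (n_samples, 2)
features = dataTrain[['Temperature(K)', 'Pressure(ATM)']].to_numpy()
print(features.shape)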

Also, if you want to know the coefficients and the intercept of the expression

CompressibilityFactor(Z) = intercept + coef1*Temperature(K) + coef2*Pressure(ATM)

you can get them with:

coefficients = model.coef_
intercept = model.intercept_
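
As an illustrative check (not in the original answer, and assuming x_test is the NumPy array built above), you can combine the intercept and the two coefficients by hand and compare the result against model.predict:

# Z = intercept + coef1*Temperature(K) + coef2*Pressure(ATM) for the first test row
T, P = x_test[0]
z_manual = intercept + coefficients[0] * T + coefficients[1] * P
print(z_manual, model.predict(x_test[:1])[0])  # the two values should match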
