LinearRegression
LinearRegression fits a linear model with coefficients w = (w_1, ..., w_p) to minimize the residual sum of squares between the observed responses in the dataset and the responses predicted by the linear approximation. Mathematically it solves a problem of the form:
    min_w ||Xw - y||_2^2
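This objective can be checked directly: on a small dataset (the data below is made up for illustration), the coefficients found by LinearRegression should agree with an ordinary least-squares solve via NumPy's `lstsq`, since both minimize the same residual sum of squares. A minimal sketch:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Toy data: y = 1 + 2*x plus a little noise
rng = np.random.RandomState(0)
X = rng.rand(50, 1)
y = 1.0 + 2.0 * X[:, 0] + 0.01 * rng.randn(50)

# sklearn's estimator minimizes ||Xw - y||^2 (with an intercept)
reg = LinearRegression().fit(X, y)

# The same minimization solved directly with least squares:
# prepend a column of ones so the intercept is part of w
A = np.hstack([np.ones((50, 1)), X])
w, *_ = np.linalg.lstsq(A, y, rcond=None)

print(reg.intercept_, reg.coef_[0])  # both close to 1 and 2
print(w)                             # same solution: [intercept, slope]
```

Both routes return the same (intercept, slope) pair because the least-squares problem here has a unique minimizer.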
>>> from sklearn import linear_model
>>> clf = linear_model.LinearRegression()
>>> clf.fit([[0, 0], [1, 1], [2, 2]], [0, 1, 2])
LinearRegression(copy_X=True, fit_intercept=True, n_jobs=1, normalize=False)
>>> clf.coef_
array([ 0.5,  0.5])
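Once fitted, the same estimator can predict responses for unseen inputs with `predict`, and the fitted intercept is available as `intercept_`. Continuing the snippet above:

```python
from sklearn import linear_model

clf = linear_model.LinearRegression()
clf.fit([[0, 0], [1, 1], [2, 2]], [0, 1, 2])

# Predict the response for a new sample; the training data lies
# exactly on y = 0.5*x1 + 0.5*x2, so [3, 3] maps to 3.0
print(clf.predict([[3, 3]]))  # ≈ [3.]
print(clf.intercept_)         # ≈ 0.0
```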
Complete code example
#!/usr/bin/env python
# coding=utf-8
import matplotlib.pyplot as plt
import numpy as np
from sklearn import datasets, linear_model

print(__doc__)

# Load the diabetes dataset
diabetes = datasets.load_diabetes()

# Use only one feature
diabetes_X = diabetes.data[:, np.newaxis]
diabetes_X_temp = diabetes_X[:, :, 2]

# Split the data into training/testing sets
diabetes_X_train = diabetes_X_temp[:-20]
diabetes_X_test = diabetes_X_temp[-20:]

# Split the targets into training/testing sets
diabetes_y_train = diabetes.target[:-20]
diabetes_y_test = diabetes.target[-20:]

# Create linear regression object and fit it on the training set
regr = linear_model.LinearRegression()
regr.fit(diabetes_X_train, diabetes_y_train)

# The coefficients
print('Coefficients: \n', regr.coef_)
# The mean squared error on the test set
print("Residual sum of squares: %.2f"
      % np.mean((regr.predict(diabetes_X_test) - diabetes_y_test) ** 2))

# Plot outputs
plt.scatter(diabetes_X_test, diabetes_y_test, color='black')
plt.plot(diabetes_X_test, regr.predict(diabetes_X_test),
         color='blue', linewidth=3)
plt.title("linear_model example")
plt.xlabel("X")
plt.ylabel("Y")
# plt.xticks(())
# plt.yticks(())
plt.show()
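Beyond the residual sum of squares printed above, the fit can be summarized with `score`, which for LinearRegression returns the coefficient of determination R^2 (1.0 is a perfect fit). A sketch using the same single feature and the same last-20 test split as the example above:

```python
import numpy as np
from sklearn import datasets, linear_model

diabetes = datasets.load_diabetes()

# Same single feature (column 2) and the same train/test split
X = diabetes.data[:, np.newaxis, 2]
X_train, X_test = X[:-20], X[-20:]
y_train, y_test = diabetes.target[:-20], diabetes.target[-20:]

regr = linear_model.LinearRegression().fit(X_train, y_train)

# R^2 on held-out data; a single feature explains only part of the variance
r2 = regr.score(X_test, y_test)
print("R^2 on the test set: %.2f" % r2)
```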
Adapted from:
http://scikit-learn.org/dev/auto_examples/linear_model/plot_ols.html#example-linear-model-plot-ols-py
Getting started with sklearn linear models