This article is reproduced from: http://blog.sina.com.cn/s/blog_8eee7fb60101g25j.html
Curve fitting: (linear regression method: lm)
1. Sort x.
2. Fit the linear regression equation and assign it to a new variable:
z = lm(y ~ x + I(x^2) + ...)
3. plot(x, y)          # scatter plot of y against x
4. lines(x, fitted(z)) # add the fitted values against x and connect them with a line
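A minimal runnable sketch of these four steps (the quadratic model and simulated data below are made up purely for illustration):
set.seed(1)
x = sort(runif(50, 0, 10))                # step 1: x must be sorted before drawing lines
y = 2 + 0.5 * x - 0.3 * x^2 + rnorm(50)   # simulated data for illustration
z = lm(y ~ x + I(x^2))                    # step 2: polynomial regression via lm
plot(x, y)                                # step 3: scatter plot of y against x
lines(x, fitted(z))                       # step 4: overlay the fitted curve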
Curve fitting: (nls)
lm straightens the curve (linearizes it) before running a regression, whereas nls fits the curve directly.
Three things are required: the curve equation, the data, and starting estimates for the coefficients.
If the curve equation is complex, you can first define it as a named custom function.
Example:
f = function(x1, x2, a, b) a + x1 + x2^b
result = nls(x$y ~ f(x$x1, x$x2, a, b), data = x, start = list(a = 1, b = 2))
# x can be a data frame or a list, but not a matrix
# the starting estimates should be as close to the true values as possible;
# if they are too far off, nls stops with the error "singular gradient"
summary(result)  # the output contains the coefficient estimates and p-values
With the estimated coefficients, the fitted curve can then be drawn directly on the scatter plot with lines().
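A self-contained sketch of the whole nls workflow; the exponential model y = a*exp(-b*x) and the simulated data are assumptions chosen for illustration, not part of the original example:
set.seed(1)
d = data.frame(x = seq(0, 5, 0.1))
d$y = 3 * exp(-0.8 * d$x) + rnorm(nrow(d), sd = 0.1)   # simulated data, true a = 3, b = 0.8
f = function(x, a, b) a * exp(-b * x)                  # the curve equation as a custom function
result = nls(y ~ f(x, a, b), data = d, start = list(a = 1, b = 1))  # starting values of roughly the right magnitude
summary(result)                                        # coefficient estimates and p-values
plot(d$x, d$y)                                         # scatter plot
lines(d$x, fitted(result))                             # overlay the fitted curve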
Curve fitting: (local regression)
lowess(x, y = NULL, f = 2/3, iter = 3)
# you can supply just x, or both x and y
# f is the window-width (smoother span) parameter; the larger it is, the smoother the curve
# iter is the number of iterations; the larger it is, the slower the calculation
loess(y ~ x, data, span = 0.75, degree = 2)
# data is the data set containing x and y; span is the window-width parameter
# degree defaults to 2 (local quadratic regression)
# this method needs roughly 10 MB of memory for about 1000 data points
Example:
x = seq(0, 10, 0.1); y = sin(x) + rnorm(length(x))   # the x values must be sorted
plot(x, y)                        # scatter plot
lines(lowess(x, y))               # regression curve via lowess
lines(x, predict(loess(y ~ x)))   # regression curve via loess; predict gives the fitted values
z = loess(y ~ x); lines(x, z$fitted)   # another way to draw the loess regression curve
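To see the effect of the window-width parameter, a short sketch reusing x and y from the example above (the specific f and span values are arbitrary assumptions):
plot(x, y)
lines(lowess(x, y, f = 0.1), lty = 2)                       # small f: wiggly curve that follows the noise
lines(lowess(x, y, f = 2/3))                                # default f: smoother curve
lines(x, predict(loess(y ~ x, span = 0.3)), col = "gray")   # loess with a narrower span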