Http://blog.sina.com.cn/s/blog_731140ed0101bozs.html
1. Regression generally refers to linear regression, that is, the process of finding a least-squares solution. Before the regression, all data points are assumed to satisfy the same curve equation, and the computation only needs to determine the coefficients of that equation.
2. Polynomial interpolation: use a polynomial to approximate a tabulated (list) function, and require the polynomial to pass through the given data points of that tabulated function. (The interpolation curve passes exactly through the data points.)
3. Polynomial approximation: find a simpler polynomial to approximately replace a complicated function, with the error smallest in some measure. (Approximation only requires that the curve stay close to the data points and follow their trend.)
4. Polynomial fitting: unlike the interpolation problem, the errors at the given data points are taken into account; it only requires that, when the polynomial replaces the tabulated function, the error is smallest in some measure. (A code sketch contrasting interpolation and fitting follows this list.)
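As a rough illustration of items 1-4, here is a minimal Python sketch (NumPy only; the data points are invented for the example) that contrasts an interpolating polynomial, which passes through every point, with a low-degree least-squares fit, which only follows the trend:

```python
import numpy as np

# Invented tabulated data: n + 1 = 6 points (x_i, y_i)
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([1.0, 2.7, 5.8, 6.6, 7.5, 9.9])

# Polynomial interpolation: the degree-n (= 5) polynomial passes exactly
# through all 6 points, so its residuals at the data points are ~0.
interp_coeffs = np.polyfit(x, y, deg=len(x) - 1)

# Polynomial fitting / linear regression: a degree-1 polynomial chosen
# so that the sum of squared errors at the data points is smallest.
fit_coeffs = np.polyfit(x, y, deg=1)

print("interpolation residuals:", y - np.polyval(interp_coeffs, x))  # ~0
print("least-squares line: y = %.3f x + %.3f" % tuple(fit_coeffs))
```

The same np.polyfit call does both jobs here; only the chosen degree decides whether the curve must pass through the points (interpolation) or merely follow them in the least-squares sense (fitting).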
Note:
Tabulated (list) function: given n+1 distinct data points (x0, y0), (x1, y1), ..., (xn, yn), the function represented by this set of data is called a tabulated function.
Approximating function: find a function y = f(x) that, by some criterion, best reflects this set of data; such a function is called an approximating function of the tabulated function.
Interpolation function: different criteria lead to different functions. For example, if the function y = f(x) is required to take, at the n+1 data points above, values equal to the corresponding ordinates, i.e. yi = f(xi) (i = 0, 1, 2, ..., n), then this function-approximation problem is called an interpolation problem; y = f(x) is called the interpolation function and the xi are called the interpolation points.
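A minimal sketch of the interpolation condition yi = f(xi), assuming SciPy is available (scipy.interpolate.lagrange builds the unique polynomial of degree at most n through the n+1 points; the data below are invented):

```python
import numpy as np
from scipy.interpolate import lagrange

# Tabulated function: n + 1 = 4 data points (x_i, y_i)
x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.0, 3.0, 2.0, 5.0])

f = lagrange(x, y)   # interpolation function y = f(x), degree <= n
print(f(x))          # equals y at every interpolation point x_i
print(f(1.5))        # f can also be evaluated between the interpolation points
```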
Interpolation and fitting are important parts of function approximation or numerical approximation.
What they have in common is to obtain, from known constraints on a set M of discrete points (M contained in S), an unknown continuous function defined on a continuous set S, and thereby capture its overall behavior.
The purpose, in other words, is to "glimpse a few spots" and thereby "know the whole picture".
Simply put, fitting means that a number of discrete values {f1, f2, ..., fn} of a function are known, and some undetermined coefficients of a chosen function f(λ1, λ2, ..., λn) are adjusted so that the difference (in the least-squares sense) between this function and the known set of points is smallest. If the undetermined function is linear in its coefficients, this is called linear fitting or linear regression (mainly in statistics); otherwise it is called nonlinear fitting or nonlinear regression. The fitted expression can also be a piecewise function, in which case it is called spline fitting.
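As a sketch of the three cases in the paragraph above (linear fitting, nonlinear fitting, spline fitting), assuming SciPy is available and using invented noisy data; the model form and coefficient names are just placeholders for the example:

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.interpolate import UnivariateSpline

x = np.linspace(0.0, 4.0, 20)
y = 2.0 * np.exp(0.5 * x) + np.random.normal(scale=0.1, size=x.size)

# Linear fitting / linear regression: the model is linear in its coefficients.
a, b = np.polyfit(x, y, deg=1)

# Nonlinear fitting: the model l1 * exp(l2 * x) is nonlinear in the
# undetermined coefficients l1, l2 (least squares via curve_fit).
def model(x, l1, l2):
    return l1 * np.exp(l2 * x)

(l1, l2), _ = curve_fit(model, x, y, p0=(1.0, 1.0))

# Spline fitting: the fitted expression is a piecewise polynomial; a
# smoothing factor s > 0 makes it a least-squares fit, not an interpolant.
spline = UnivariateSpline(x, y, s=0.5)

print("linear fit:    y = %.2f x + %.2f" % (a, b))
print("nonlinear fit: y = %.2f * exp(%.2f x)" % (l1, l2))
print("spline residual sum of squares:", spline.get_residual())
```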
Interpolation means that the function values (or derivative information) of a function at several discrete points are known, and an interpolation function of a prescribed form, together with its undetermined coefficients, is solved for so that the constraints at the given discrete points are satisfied. The interpolation function is built from basis functions; if the basis functions are defined on the whole domain, they are called global bases, otherwise they are called sub-domain (piecewise) bases. If the constraints involve only function values, the scheme is called Lagrange interpolation; otherwise it is called Hermite interpolation.
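A minimal sketch of the two interpolation types named above, assuming SciPy: Lagrange interpolation uses only function-value constraints, while Hermite interpolation also imposes derivative values (here via scipy.interpolate.CubicHermiteSpline); the node values and slopes are invented:

```python
import numpy as np
from scipy.interpolate import lagrange, CubicHermiteSpline

x = np.array([0.0, 1.0, 2.0])
y = np.array([0.0, 1.0, 0.0])        # function values at the nodes
dydx = np.array([1.0, 0.0, -1.0])    # derivative values at the nodes

# Lagrange interpolation: constrained by function values only.
p = lagrange(x, y)

# Hermite interpolation: constrained by function values and derivatives.
h = CubicHermiteSpline(x, y, dydx)

print(p(x))            # matches y at the nodes
print(h(x), h(x, 1))   # matches both y and dy/dx at the nodes
```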
In the geometric sense, fitting means: given a number of points in space, find a continuous surface of known form but unknown parameters that approximates these points as closely as possible, whereas interpolation means finding one (possibly piecewise smooth) continuous surface that passes through these points.
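A sketch of this geometric picture, with invented scattered points and assuming SciPy: fitting finds the parameters of a plane of known form z = a x + b y + c that approximates the points in the least-squares sense, while scipy.interpolate.griddata builds a piecewise-linear surface that passes through them:

```python
import numpy as np
from scipy.interpolate import griddata

rng = np.random.default_rng(0)
pts = rng.uniform(0.0, 1.0, size=(30, 2))                     # scattered (x, y)
z = 2.0 * pts[:, 0] - pts[:, 1] + 0.05 * rng.normal(size=30)  # noisy heights

# Fitting: plane z = a*x + b*y + c of known form, unknown parameters (a, b, c),
# chosen to approximate the points as closely as possible (least squares).
A = np.column_stack([pts, np.ones(len(pts))])
(a, b, c), *_ = np.linalg.lstsq(A, z, rcond=None)

# Interpolation: a piecewise-linear surface that passes through the points.
query = np.array([[0.5, 0.5]])
z_interp = griddata(pts, z, query, method='linear')

print("fitted plane: z = %.2f x + %.2f y + %.2f" % (a, b, c))
print("interpolated value at (0.5, 0.5):", z_interp[0])
```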
The difference between regression, interpolation, approximation, and fitting