The least squares method (also called the least squares approach) is a data-fitting optimization technique: it finds the function that best matches the data by minimizing the sum of squared errors.
The least squares method was introduced by Legendre and Gauss and later became the computational core of the regression analysis founded by Galton; it has since become the most important way to explore relationships between variables. The method is named after its mathematical principle: the sum of squared errors is minimized, and the parameter values that put the sum of squared errors in this minimum state can be regarded as the best estimates of the function's parameters.
First, posing the problem
When we study the relationship between variables, we collect a certain amount of sample data, which appears as data points on a two-dimensional coordinate plot. In theory, if the variables have a known functional relationship, the graph of that function (a curve or a straight line) passes through all of the data points. In practice, however, what we obtain is sample data, and sample data contain errors; the function we compute from the sample therefore also deviates from the known function, and its graph cannot pass through all of the data points.
Because of these errors, the function graphs produced by the sample data of repeated experiments differ from one another; even for the same set of sample data, if there is no uniform standard, different people will draw different function graphs.
So we need a standard that, from these error-laden sample data, identifies an approximate function that is as close as possible to the known function. The graph of the approximate function need not pass through every data point, but it should, as far as possible, have the data points distributed symmetrically and evenly on both sides of it, so that the curve reflects the overall distribution of the data without being distorted by large local fluctuations. Using an approximate function to fit the known function in this way is the curve-fitting problem.
Generally speaking, curve fitting involves two tasks:
1. When the functional relationship between the variables is known and only its constants are unknown, fit the best estimates of the constants from the data points (a sketch follows this list).
2. When the functional relationship between the variables is unknown, fit an empirical formula for the relationship from the data points and obtain the best estimates of its constants.
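As a sketch of task 1 (assuming Python with NumPy and SciPy; the exponential model and the sample values here are made up for the demonstration), scipy.optimize.curve_fit estimates the unknown constants by minimizing the sum of squared residuals:

```python
import numpy as np
from scipy.optimize import curve_fit

# Task 1: the functional form is known (here, an exponential chosen for the
# demo: y = a * exp(b * x)); only the constants a and b are unknown.
def model(x, a, b):
    return a * np.exp(b * x)

# Synthetic sample data: the true curve plus measurement error.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 2.0, 30)
y = model(x, 2.0, 1.5) + rng.normal(scale=0.5, size=x.size)

# curve_fit estimates a and b by minimizing the sum of squared residuals.
(a_hat, b_hat), _ = curve_fit(model, x, y, p0=(1.0, 1.0))
print(a_hat, b_hat)  # close to the true constants 2.0 and 1.5
```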
Second, solving the problem
As noted earlier, without a single standard one can find countless approximate functions. We can proceed according to the following principles:
1. To describe the relationship between the variables accurately and comprehensively, all observations of the variables should be used (in practice this is difficult, so sample data are generally used).
2. Whether the relationship between the variables is a straight line or a curve can be judged from the covariance or the correlation coefficient (a quick check is sketched after this list).
3. The "best" approximate function should have the smallest deviation from the known function, i.e., it should minimize the vertical distances from all data points to the function's graph (curve or line).
In accordance with the above principles, we analyze as follows:
Let the known function be y = f(x) and the approximate function be φ(x).
Let Δi = yi - φ(xi).
Δi is the residual at the i-th data point. To make the residuals small overall, there are different methods: minimize the sum of the residuals Σ Δi (unreliable, since positive and negative residuals cancel), the sum of absolute residuals Σ |Δi| (hard to handle analytically), the largest absolute residual max |Δi|, or the sum of squared residuals Σ Δi².
The fourth of these, minimizing the sum of squared residuals, is the least squares method.
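To make the criterion concrete, here is a sketch for the simplest case, a straight-line approximate function φ(x) = a + bx (this choice of form is mine, for illustration). Setting the partial derivatives of S(a, b) = Σ (yi - a - b·xi)² to zero yields the normal equations, whose closed-form solution is implemented below:

```python
import numpy as np

def least_squares_line(x, y):
    """Fit phi(x) = a + b*x by minimizing the sum of squared residuals.

    Setting the partial derivatives of S(a, b) = sum((y_i - a - b*x_i)**2)
    with respect to a and b to zero gives the normal equations, whose
    solution is the closed form below.
    """
    x_bar, y_bar = x.mean(), y.mean()
    b = np.sum((x - x_bar) * (y - y_bar)) / np.sum((x - x_bar) ** 2)
    a = y_bar - b * x_bar
    return a, b

x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 2.9, 5.2, 7.1, 8.8])
a, b = least_squares_line(x, y)
print(a, b)                 # intercept and slope
print(np.polyfit(x, y, 1))  # [slope, intercept] -- same answer from NumPy
```

The comparison with np.polyfit is only a sanity check: both minimize the same sum of squared residuals.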
In practical applications, the sample data are not all of equal precision or equal standing; data of higher precision or greater importance should be given larger weights, and in that case the weighted least squares method is used.
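A minimal sketch of the weighted version, again for a straight-line fit (the example weights are hypothetical; in practice they often come from known measurement precisions, e.g. inverse error variances). The criterion becomes minimizing Σ wi·Δi² instead of Σ Δi²:

```python
import numpy as np

def weighted_least_squares_line(x, y, w):
    """Fit phi(x) = a + b*x by minimizing sum(w_i * (y_i - a - b*x_i)**2).

    The normal equations are the same as in the unweighted case, but with
    weighted means replacing ordinary means.
    """
    x_bar = np.sum(w * x) / np.sum(w)
    y_bar = np.sum(w * y) / np.sum(w)
    b = np.sum(w * (x - x_bar) * (y - y_bar)) / np.sum(w * (x - x_bar) ** 2)
    a = y_bar - b * x_bar
    return a, b

x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([0.9, 2.1, 2.8, 4.2])
w = np.array([1.0, 1.0, 4.0, 4.0])  # last two points measured more precisely
print(weighted_least_squares_line(x, y, w))
```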
===============================================================
A regression equation estimated by least squares has the following properties:
1. Unbiased
It can be proved that the expectation (mean) of each parameter estimate in the regression equation obtained by least squares equals the corresponding parameter value in the true equation.
Suppose the true linear regression equation has the form
Y = β0 + β1·X + μ
where μ is the error term. The regression equation obtained from the least squares estimators is
Ŷ = β̂0 + β̂1·X
Then E(β̂0) = β0 and E(β̂1) = β1.
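A quick simulation sketch of this property (the true values β0 = 1, β1 = 2 and the noise level are assumptions made for the demo): averaging the least squares estimates over many repeated samples recovers the true parameters:

```python
import numpy as np

rng = np.random.default_rng(1)
x = np.linspace(0, 10, 50)
beta0, beta1 = 1.0, 2.0  # true parameters

estimates = []
for _ in range(5000):
    y = beta0 + beta1 * x + rng.normal(scale=2.0, size=x.size)
    b1, b0 = np.polyfit(x, y, 1)  # least squares fit of a degree-1 polynomial
    estimates.append((b0, b1))

print(np.mean(estimates, axis=0))  # ~[1.0, 2.0]: the estimates average out
                                   # to the true parameter values
```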
2. Linear
It can be proved that the parameters estimated by least squares are linear functions of the observed values of Y.
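For the slope, the linearity is explicit: β̂1 = Σ ki·yi, where the weights ki = (xi - x̄) / Σ (xj - x̄)² depend only on the x values. A small numeric check (the data values are arbitrary):

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([2.2, 2.9, 4.1, 4.8])

# The weights depend only on x, so the slope estimate is linear in y.
k = (x - x.mean()) / np.sum((x - x.mean()) ** 2)
slope_linear_form = np.sum(k * y)

slope_polyfit = np.polyfit(x, y, 1)[0]
print(slope_linear_form, slope_polyfit)  # identical
```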
3. Minimum variance
For the same sample, different methods can be used to estimate the parameters, and several of them may yield estimators that are both unbiased and linear; but among all such estimators, it can be proved that the least squares estimators have the minimum variance.
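To illustrate, here is a simulation sketch comparing the least squares slope with another linear unbiased estimator: the slope of the line through the first and last data points, a competitor chosen only for this demo. Both are centered on the true slope, but the least squares slope has the smaller variance:

```python
import numpy as np

rng = np.random.default_rng(2)
x = np.linspace(0, 10, 50)
beta0, beta1 = 1.0, 2.0  # true parameters

ols_slopes, endpoint_slopes = [], []
for _ in range(5000):
    y = beta0 + beta1 * x + rng.normal(scale=2.0, size=x.size)
    ols_slopes.append(np.polyfit(x, y, 1)[0])
    # Competitor: slope through the first and last points (also linear
    # in y and unbiased, but it ignores all the points in between).
    endpoint_slopes.append((y[-1] - y[0]) / (x[-1] - x[0]))

print(np.mean(ols_slopes), np.var(ols_slopes))            # mean ~2.0, small variance
print(np.mean(endpoint_slopes), np.var(endpoint_slopes))  # mean ~2.0, larger variance
```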
The above three properties are also the reference standards for judging whether a parameter estimator is good. Because the least squares estimators possess all three properties at once, they are also called the best linear unbiased estimators, abbreviated BLUE.
Under the classical Gauss-Markov assumptions, we do not need to look for other unbiased estimators: none of them will be better than the ordinary least squares estimator. In other words, the variance of any linear unbiased estimator is at best as small as, and never smaller than, the variance of the ordinary least squares estimator. This is the Gauss-Markov theorem. It is precisely because the least squares estimators are BLUE that the method is so widely used, but note that these good properties depend on the classical assumptions holding.
For a practical curve-fitting problem, the usual approach is to draw a scatter plot on the two-dimensional coordinate plane, observe which kind of curve the scatter resembles, and then select the corresponding fitting equation. Some nonlinear curves can be converted into linear ones by a suitable change of variables, after which the linear fit described above applies directly. Common transformations include, for example:
1. y = a·e^(bx): taking logarithms gives ln y = ln a + bx, which is linear in x (sketched below).
2. y = a·x^b: taking logarithms gives ln y = ln a + b·ln x, which is linear in ln x.
3. y = a + b/x: substituting u = 1/x gives y = a + bu, which is linear in u.
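A minimal sketch of the first transformation (the data are synthetic, with multiplicative noise so that taking logarithms is well-behaved): fit y = a·e^(bx) by running an ordinary linear least squares fit on ln y:

```python
import numpy as np

rng = np.random.default_rng(3)
x = np.linspace(0.1, 3.0, 40)
y = 2.0 * np.exp(1.5 * x) * rng.lognormal(sigma=0.05, size=x.size)

# Transform: ln y = ln a + b * x, then do an ordinary linear least squares fit.
b, ln_a = np.polyfit(x, np.log(y), 1)
a = np.exp(ln_a)
print(a, b)  # ~2.0 and ~1.5, the constants used to generate the data
```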
(Reposted from: http://www.cnblogs.com/xmdata-analysis/p/5048446.html)