Ordinary Least Squares
The best-fit line is obtained when the sum of squared residuals reaches its minimum.
The coefficients w that minimize this quadratic objective can be found by taking the partial derivatives with respect to w and setting them to zero.
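Setting the partial derivatives to zero yields the normal equations, w = (X^T X)^(-1) X^T y. A minimal NumPy sketch (the toy data below is made up for illustration):

```python
import numpy as np

# Toy data: y = 2*x + 1, with a leading column of ones for the intercept
X = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 2.0], [1.0, 3.0]])
y = np.array([1.0, 3.0, 5.0, 7.0])

# Normal equations: w = (X^T X)^{-1} X^T y
# np.linalg.solve is preferred over an explicit inverse for numerical stability.
w = np.linalg.solve(X.T @ X, X.T @ y)
print(w)  # intercept = 1, slope = 2
```
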
An equivalent way of writing the expression above:
This can also be simplified to:
Derivation process:
Ridge Regression
A problem arises with the formula above under multicollinearity: X^T X becomes nearly singular, its determinant approaches 0, and the inverse can no longer be computed stably.
This problem can be eliminated by transforming it into the following equation.
k is the ridge parameter: when k = 0 the ordinary least squares solution is recovered, and as k grows large the ridge regression coefficients shrink toward 0.
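This behavior can be checked directly from the ridge solution w = (X^T X + k I)^(-1) X^T y. A small sketch with synthetic data (the coefficients and noise scale are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + rng.normal(scale=0.1, size=20)

def ridge(X, y, k):
    """Ridge solution: w = (X^T X + k I)^{-1} X^T y."""
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + k * np.eye(p), X.T @ y)

w_ols = ridge(X, y, 0.0)   # k = 0 recovers the least squares solution
w_big = ridge(X, y, 1e6)   # large k shrinks all coefficients toward 0
print(np.linalg.norm(w_ols), np.linalg.norm(w_big))
```
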
Knowledge Points:
Multi-collinearity
High correlation among the explanatory variables in a regression model.
Biased and unbiased estimates
Personal understanding (to be revised): when direct computation is difficult and the estimate is obtained after a transformation or by adding an auxiliary term, the transformed computation gives a biased estimate;
an estimate computed without such a transformation is an unbiased estimate.
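The note above is the author's personal take; a standard concrete example of the distinction is the sample variance, where dividing by n gives a biased estimator and dividing by n-1 gives an unbiased one. A small simulation (sample size and repetition count are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(2)
true_var = 4.0  # samples drawn from N(0, 2^2)

biased, unbiased = [], []
for _ in range(20000):
    s = rng.normal(scale=2.0, size=5)
    biased.append(s.var(ddof=0))    # divides by n: biased, underestimates
    unbiased.append(s.var(ddof=1))  # divides by n-1: unbiased

# On average, the biased estimator gives (n-1)/n * sigma^2 = 3.2 here,
# while the unbiased one centers on the true variance 4.0.
print(np.mean(biased), np.mean(unbiased))
```
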
Generalized Linear Models