1. Introduction
These are learning notes on Andrew W. Moore's courseware "Predicting real-valued outputs: an introduction to regression" (to be completed gradually).
2. Single Parameter Linear Regression
Most of this material also appears in the first chapter of my PRML learning notes. Note that the optimal solution is very simple: setting the partial derivative of the squared error to 0 gives the minimum.
The value of w at that point corresponds to the minimum error.
We can then use the fitted w to predict outputs for new inputs.
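As a minimal sketch of this closed-form solution (with made-up data): for the single-parameter model y = w·x, setting the derivative of the squared error to 0 gives w = Σxᵢyᵢ / Σxᵢ².

```python
# Single-parameter linear regression y = w*x (no intercept).
# Minimizing sum_i (y_i - w*x_i)^2 and setting d/dw = 0 gives
# w = sum(x*y) / sum(x^2).

def fit_single_param(xs, ys):
    """Closed-form least-squares estimate of w for y = w*x."""
    num = sum(x * y for x, y in zip(xs, ys))
    den = sum(x * x for x in xs)
    return num / den

xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.1, 3.9, 6.2, 7.8]        # roughly y = 2x plus noise
w = fit_single_param(xs, ys)
print(round(w, 3))               # close to 2

prediction = w * 5.0             # predict the output for a new input
print(round(prediction, 2))
```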
3. Multivariate Linear Regression
Note that in the earlier curve-fitting example from PRML, the input is not multivariate: the mapping is still f(x) -> y, where x and y are single variables rather than vectors (although w is a vector of parameters). Of course, viewed in terms of basis functions, it can also be seen as multiple regression.
Why is the solution above the best fit? This is really a question about least squares and linear systems. For r training points, each with m input dimensions, we obtain the system Xw = y. If y lies in the column space of X, the system has an exact solution; the fit to the training data is then perfect, with zero error. If it does not, we can only minimize ||Xw - y||, obtaining the vector in the column space closest to y. For details see Linear Algebra and Its Applications, p. 359: the best approximation is the projection of y onto the column space of X.
The orthogonal projection is unique, and when the columns of X are linearly independent there is a unique least-squares solution, as shown in the figure above.
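The projection argument above leads to the normal equations XᵀXw = Xᵀy. A small sketch with numpy (synthetic data; `np.linalg.lstsq` would be the numerically safer choice in practice):

```python
import numpy as np

# Multivariate least squares: for X (r x m) and targets y, the normal
# equations X^T X w = X^T y project y onto the column space of X.
# When the columns of X are linearly independent, the solution is unique.

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 3))              # 20 training points, 3 input dims
true_w = np.array([1.5, -2.0, 0.5])
y = X @ true_w + 0.01 * rng.normal(size=20)   # small noise

# Solve the normal equations directly (illustrative only).
w_hat = np.linalg.solve(X.T @ X, X.T @ y)
print(np.round(w_hat, 2))                 # close to true_w
```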
4. Constants in Linear Regression
Consider the case where the regression line does not pass through the origin. The common practice is to add a dimension to the input data whose value is fixed at 1; the weight on that dimension then acts as the intercept.
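This constant-feature trick can be sketched as follows (illustrative data):

```python
import numpy as np

# A line that need not pass through the origin: append a constant
# feature 1 to each input, so its weight becomes the intercept.

x = np.array([0.0, 1.0, 2.0, 3.0])
y = 2.0 * x + 1.0                            # line with slope 2, intercept 1

X = np.column_stack([x, np.ones_like(x)])    # add the constant dimension
w, b = np.linalg.lstsq(X, y, rcond=None)[0]
print(round(w, 2), round(b, 2))              # → 2.0 1.0
```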
5. Linear Regression with Varying Noise
Suppose the variance of the noise added to each data point is known. How do we estimate w by MLE?
6. Non-Linear Regression
Suppose the relationship between y and x is known to be non-linear in w. How do we estimate w by MLE?
Common Methods
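When the model is non-linear in w, the MLE generally has no closed form, and a common method is iterative numerical optimization such as gradient descent. As an illustrative sketch (the model y = exp(w·x) here is my own example, not one from the courseware):

```python
import math

# Non-linear regression by gradient descent: fit y = exp(w*x) by
# minimizing sum_i (y_i - exp(w*x_i))^2, since no closed form exists.

def fit_gradient_descent(xs, ys, w=0.0, lr=0.01, steps=5000):
    """Gradient descent on the squared error of y = exp(w*x)."""
    for _ in range(steps):
        grad = sum(-2.0 * (y - math.exp(w * x)) * x * math.exp(w * x)
                   for x, y in zip(xs, ys))
        w -= lr * grad
    return w

xs = [0.0, 0.5, 1.0, 1.5]
ys = [math.exp(0.8 * x) for x in xs]   # noiseless data, true w = 0.8
w_hat = fit_gradient_descent(xs, ys)
print(round(w_hat, 3))                 # close to 0.8
```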
7. Polynomial Regression
The second figure below gives an example: higher-order terms such as x² are added as new features. In terms of the polynomial curve fitting given in PRML, this corresponds to an order-M polynomial of a one-dimensional input.
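The feature-expansion idea can be sketched as follows: expand a one-dimensional input into the basis (1, x, ..., x^M), which turns curve fitting into ordinary multivariate linear regression in the coefficients.

```python
import numpy as np

# Polynomial regression: expand x into the basis (1, x, ..., x^M),
# then solve an ordinary linear least-squares problem.

M = 2
x = np.array([-1.0, 0.0, 1.0, 2.0])
y = 1.0 + 2.0 * x + 3.0 * x**2               # exact quadratic

X = np.vander(x, M + 1, increasing=True)     # columns 1, x, x^2
coeffs = np.linalg.lstsq(X, y, rcond=None)[0]
print(np.round(coeffs, 2))                   # → [1. 2. 3.]
```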