Last time we covered linear regression with multiple variables (Linear Regression with Multiple Variables); this time let's talk about polynomial regression (Polynomial Regression) and the normal equation (Normal Equation). (We continue with the house-price prediction problem.)
Polynomial regression
Sometimes linear regression does not fit the data well and we need a curve, such as a quadratic model: hθ(x) = θ0 + θ1x + θ2x²
Or a cubic model: hθ(x) = θ0 + θ1x + θ2x² + θ3x³
Plotted in a coordinate system, the two models look as follows:
In general, we need to look at the data before deciding which model to use.
In addition, we can introduce new features, for example: x2 = x1², x3 = x1³.
This transforms the higher-order model into a linear regression model in the new features.
PS: If our model uses polynomial features like these, feature scaling (feature scaling) is necessary before running the gradient descent algorithm, because x1, x1², and x1³ have very different ranges.
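As a sketch (the house sizes below are made-up values for illustration), creating the new polynomial features and scaling them might look like this in Python/NumPy:

```python
import numpy as np

# Hypothetical house sizes (illustrative values, not the post's data).
size = np.array([50.0, 80.0, 100.0, 120.0, 150.0])

# New features x1 = size, x2 = size^2, x3 = size^3 turn the
# cubic model into a linear model in (x1, x2, x3).
X = np.column_stack([size, size**2, size**3])

# The columns now span wildly different ranges, so we standardize
# each column (mean 0, standard deviation 1) before gradient descent.
X_scaled = (X - X.mean(axis=0)) / X.std(axis=0)

print(X_scaled.mean(axis=0))  # each column mean is ~0
print(X_scaled.std(axis=0))   # each column std is ~1
```

Without this scaling step, the x³ column would dominate the gradient and slow convergence dramatically.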
Normal equation
So far, the machine learning algorithms we have studied have all used gradient descent (gradient descent). But for some linear regression problems, there is a better solution: the normal equation.
The normal equation finds the parameters that minimize the cost function by solving the equations ∂J(θ)/∂θj = 0 for every j.
Assuming our training feature matrix is X (which includes the column x0 = 1) and the training-set targets form the vector y, the normal equation solves for the parameter vector: θ = (XᵀX)⁻¹Xᵀy
Note: Xᵀ denotes the transpose of the matrix X, and (XᵀX)⁻¹ denotes the inverse of the matrix XᵀX.
We again use the house-price prediction data:
The data includes four features (excluding x0); we add the column x0 = 1 and solve with the normal equation method:
In MATLAB, the normal equation is written: pinv(X' * X) * X' * y
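The NumPy equivalent is a one-liner as well. Here is a sketch on a small made-up training set (the numbers are illustrative, not the post's actual data); np.linalg.pinv plays the same role as MATLAB's pinv:

```python
import numpy as np

# Made-up training set: 4 examples, x0 = 1 plus 2 features (size, bedrooms).
X = np.array([[1.0, 2104.0, 3.0],
              [1.0, 1600.0, 3.0],
              [1.0, 2400.0, 3.0],
              [1.0, 1416.0, 2.0]])
y = np.array([400.0, 330.0, 369.0, 232.0])

# Normal equation: theta = (X^T X)^(-1) X^T y.
theta = np.linalg.pinv(X.T @ X) @ X.T @ y

# theta minimizes the squared error, so X @ theta is the least-squares fit.
print(theta)
```

No learning rate and no iterations are needed; the answer comes out in one matrix computation.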
Note that XᵀX is not always invertible. This usually happens because features are linearly dependent (for example, the size in feet and the same size in meters), or because the number of features exceeds the number of training examples (for example, 2000 features but only 1000 training examples). In such cases the normal equation method cannot be used directly.
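To see the feet-versus-meters case concretely, here is a sketch with deliberately dependent features. The pseudo-inverse (MATLAB's pinv, NumPy's np.linalg.pinv) still returns a usable solution even though XᵀX is singular:

```python
import numpy as np

# Redundant features: size in feet and size in meters (meters = feet * 0.3048),
# so the columns of X are linearly dependent and X^T X is singular.
feet = np.array([1000.0, 1500.0, 2000.0])
X = np.column_stack([np.ones(3), feet, feet * 0.3048])
y = np.array([200.0, 300.0, 400.0])

XtX = X.T @ X
# XtX is 3x3 but only rank 2, so the ordinary inverse does not exist.
print(np.linalg.matrix_rank(XtX))  # 2

# The Moore-Penrose pseudo-inverse still yields a least-squares theta.
theta = np.linalg.pinv(XtX) @ X.T @ y
print(X @ theta)  # reproduces y
```

This is why the MATLAB snippet above uses pinv rather than inv: pinv degrades gracefully when features are redundant.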
So we now have two machine learning algorithms for linear regression: gradient descent and the normal equation. The two methods compare as follows:
Gradient descent: needs a learning rate α to be chosen; needs many iterations; works well even when the number of features n is very large.
Normal equation: no learning rate to choose; no iterations; must compute (XᵀX)⁻¹, which costs roughly O(n³), so it becomes slow when n is very large.
This concludes our discussion of linear regression. Next time we will discuss Logistic Regression.