IV. Linear Regression with Multiple Variables (Week 2)
-Multiple features
Previously we introduced the univariate (single-feature) regression model. We now add more variables to the house price prediction model, i.e., more features, such as the number of bedrooms, the number of floors, the age of the house, and so on, giving a model with multiple variables/features.
After adding more variables/features, we introduce some new notation:
$n$ denotes the number of variables/features.
$x^{(i)}$ denotes the $i$-th training instance, which is the $i$-th row of the feature matrix, a vector.
$x_j^{(i)}$ denotes the $j$-th feature of the $i$-th training instance, i.e., the $j$-th feature in row $i$ of the feature matrix.
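As a minimal illustration of this notation (the numbers below are made up, not from the course), each symbol maps directly onto a row or entry of a feature matrix:

```python
import numpy as np

# Feature matrix X: one row per training instance, one column per feature.
# Made-up example: 3 houses, n = 2 features (size in square feet, bedrooms).
X = np.array([[2104, 3],
              [1600, 3],
              [2400, 4]])

m, n = X.shape        # m = 3 training instances, n = 2 features

x_2 = X[1]            # x^(2): the 2nd training instance (row 2), a vector
x_2_1 = X[1, 0]       # x_1^(2): the 1st feature of the 2nd instance (1600)
```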
Therefore, the hypothesis $h$ for multiple variables/features can be expressed as:

$h_\theta(x) = \theta_0 + \theta_1 x_1 + \theta_2 x_2 + \dots + \theta_n x_n$
This formula has $n+1$ parameters and $n$ variables. To simplify it, we introduce $x_0 = 1$, so that it becomes:

$h_\theta(x) = \theta_0 x_0 + \theta_1 x_1 + \theta_2 x_2 + \dots + \theta_n x_n$
At this point the parameter $\theta$ in $h$ is an $(n+1)$-dimensional vector, and each training instance is also an $(n+1)$-dimensional vector.
The formula can therefore be simplified to:

$h_\theta(x) = \theta^T x$

where the superscript $T$ denotes the transpose of the matrix.
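A minimal NumPy sketch of this vectorized hypothesis (the variable names and numbers are my own, for illustration only):

```python
import numpy as np

def hypothesis(theta, x):
    """h_theta(x) = theta^T x, where x already includes x_0 = 1."""
    return theta @ x

# Made-up example with n = 2 features.
theta = np.array([80.0, 0.1, 25.0])   # theta_0, theta_1, theta_2
x = np.array([1.0, 2104.0, 3.0])      # x_0 = 1, then the two feature values

print(hypothesis(theta, x))           # predicted price for this instance
```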
-Gradient descent for multiple variables
As in univariate/single-feature linear regression, in multivariate linear regression we also define a cost function, namely:

$J(\theta_0, \theta_1, \dots, \theta_n) = \frac{1}{2m} \sum_{i=1}^{m} \left( h_\theta(x^{(i)}) - y^{(i)} \right)^2$
Our goal is the same as in the univariate/single-feature case: to find the combination of parameters that minimizes the cost function.
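A minimal sketch of this cost function in NumPy, assuming the matrix X already contains the $x_0 = 1$ column:

```python
import numpy as np

def compute_cost(X, y, theta):
    """J(theta) = 1/(2m) * sum((h_theta(x^(i)) - y^(i))^2)."""
    m = len(y)
    errors = X @ theta - y        # prediction error for every instance
    return (errors @ errors) / (2 * m)
```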
Therefore, the gradient descent algorithm for multivariate linear regression is:

Repeat { $\theta_j := \theta_j - \alpha \frac{\partial}{\partial \theta_j} J(\theta_0, \theta_1, \dots, \theta_n)$ } (simultaneously updating $\theta_j$ for $j = 0, 1, \dots, n$)

That is, after taking the partial derivative, we obtain:

$\theta_j := \theta_j - \alpha \frac{1}{m} \sum_{i=1}^{m} \left( h_\theta(x^{(i)}) - y^{(i)} \right) x_j^{(i)}$
We start from an initial (e.g., random) set of parameter values, repeatedly update all parameters simultaneously with this rule, and monitor the cost function until it converges.
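A minimal sketch of batch gradient descent under these definitions (the learning rate, iteration count, and toy data are illustrative choices, not values from the course):

```python
import numpy as np

def gradient_descent(X, y, theta, alpha=0.1, num_iters=2000):
    """Batch gradient descent: theta_j := theta_j - alpha/m * sum(errors * x_j^(i))."""
    m = len(y)
    for _ in range(num_iters):
        errors = X @ theta - y                        # h_theta(x^(i)) - y^(i) for all i
        theta = theta - (alpha / m) * (X.T @ errors)  # simultaneous update of every theta_j
    return theta

# Made-up data: 3 instances, an x_0 = 1 column plus one feature, with y = 2 * x_1.
X = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
y = np.array([2.0, 4.0, 6.0])

theta = gradient_descent(X, y, np.zeros(2))
print(theta)   # converges toward [0., 2.]
```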