Machine Learning: Linear Regression With Multiple Variables
Continuing from the house-price prediction example of the previous article, we now extend linear regression to multiple variables.
Here we use vector notation to make the expressions more concise.
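As a minimal sketch of what the vector notation buys us, the hypothesis h_theta(x) = theta' * x can be evaluated for every training example with a single matrix product (the data and parameter values below are made up for illustration):

```python
import numpy as np

# Hypothetical 2-feature training set: each row is [x0, x1, x2], with x0 = 1
# as the usual intercept convention.
X = np.array([[1.0, 2104.0, 3.0],
              [1.0, 1600.0, 3.0],
              [1.0, 2400.0, 4.0]])
theta = np.array([100.0, 0.1, 50.0])  # example parameter vector

# Vectorized hypothesis: h_theta(x) = theta' * x for all examples at once.
predictions = X @ theta
```

One matrix-vector product replaces an explicit loop over examples and features.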
As in the single-variable case, gradient descent with multiple variables must update all the theta values simultaneously.
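A minimal sketch of batch gradient descent with the simultaneous update, assuming numpy and a design matrix X whose first column is all ones (the function name and defaults are my own choices, not from the course):

```python
import numpy as np

def gradient_descent(X, y, alpha=0.01, iters=1000):
    """Batch gradient descent for linear regression.

    All components of theta are computed from the *same* old theta and
    assigned in one step, i.e. the simultaneous update.
    """
    m, n = X.shape
    theta = np.zeros(n)
    for _ in range(iters):
        error = X @ theta - y           # shape (m,): h_theta(x) - y
        grad = (X.T @ error) / m        # shape (n,): partial derivatives
        theta = theta - alpha * grad    # every theta_j updated at once
    return theta
```

Because the gradient is computed as one vector before theta is reassigned, no theta_j ever sees a partially updated parameter vector.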
Feature scaling is performed to accelerate the convergence of the gradient descent algorithm. As shown in the figure on the left, when the magnitudes of two features differ greatly, the contours of the cost function form tall, narrow ellipses, and the convergence trajectory zigzags toward the minimum. This is too slow! The figure on the right shows the contour plot after feature scaling: the value ranges of the parameters are similar, the contours are nearly circular, and convergence is much faster.
Feature scaling has no strict rule; you only need to bring the values of the features into similar ranges.
Here we introduce one feature scaling method, mean normalization: x_i := (x_i - average(x)) / (max - min). Of course, other methods that bring the feature ranges close together work as well.
In addition, when we perform feature scaling on the training data in the training set, we must apply exactly the same scaling to any new example at prediction time. In other words, we need to save the statistics used for scaling, such as average(x), max, and min.
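A minimal sketch of this idea, assuming numpy: the statistics are computed once from the training set, saved, and then reused unchanged on new examples (the function names and the dictionary layout are my own illustration):

```python
import numpy as np

def fit_scaling(X):
    """Compute the per-feature statistics from the TRAINING set only."""
    return {"mean": X.mean(axis=0),
            "min": X.min(axis=0),
            "max": X.max(axis=0)}

def apply_scaling(X, stats):
    """Scale with saved statistics: (x - mean) / (max - min)."""
    return (X - stats["mean"]) / (stats["max"] - stats["min"])

# Fit on training data, then reuse the SAME stats for a new example.
X_train = np.array([[1.0, 100.0], [2.0, 200.0], [3.0, 300.0]])
stats = fit_scaling(X_train)
X_train_scaled = apply_scaling(X_train, stats)
x_new_scaled = apply_scaling(np.array([[2.5, 250.0]]), stats)
```

The key point is that `stats` comes from the training set; the new example is scaled with those saved values, never with statistics computed from itself.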
Next, let's review the problem of selecting the learning rate: if alpha is too small, convergence is slow; if alpha is too large, J(theta) may fail to decrease on every iteration and may even diverge.
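One practical check is to record J(theta) after each iteration and confirm it decreases. A toy sketch of that diagnostic, assuming numpy (the data and the two alpha values are made up to show one converging and one diverging run):

```python
import numpy as np

def cost_history(X, y, alpha, iters):
    """Run gradient descent and record J(theta) after each iteration."""
    m, n = X.shape
    theta = np.zeros(n)
    history = []
    for _ in range(iters):
        error = X @ theta - y
        history.append(float(error @ error) / (2 * m))  # current J(theta)
        theta = theta - alpha * (X.T @ error) / m
    return history

X = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 2.0], [1.0, 3.0]])
y = np.array([1.0, 3.0, 5.0, 7.0])

good = cost_history(X, y, alpha=0.1, iters=50)  # J decreases steadily
bad = cost_history(X, y, alpha=1.0, iters=50)   # J blows up: alpha too large
```

Plotting such histories against the iteration number is exactly how one picks alpha in practice: choose the largest alpha for which J still decreases every iteration.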
The choice of the polynomial degree also needs consideration, because it is tied to the underfitting and overfitting problems; we will discuss these in subsequent articles.
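For completeness, a tiny sketch of how polynomial terms become ordinary features in the multivariate framework (the data is hypothetical). Note how the resulting columns span wildly different ranges, which is one more reason feature scaling matters:

```python
import numpy as np

# Hypothetical house sizes; a cubic model treats x, x^2, x^3 as three features.
size = np.array([1000.0, 1500.0, 2000.0])
X_poly = np.column_stack([size, size**2, size**3])

# The columns now range over roughly 10^3, 10^6, and 10^9 respectively,
# so feature scaling is essential before running gradient descent on them.
```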
Next we will introduce a method for finding the optimal theta without iteration: the normal equation.
The derivation proceeds by minimizing J(theta): setting its derivative with respect to theta to zero yields the closed-form solution theta = (X' * X)^-1 * X' * y.
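A minimal sketch of the normal equation in code, assuming numpy; `np.linalg.solve` is used instead of forming the inverse explicitly, which is the numerically safer way to evaluate (X'X)^-1 X'y:

```python
import numpy as np

def normal_equation(X, y):
    """Closed-form least squares: solves (X'X) theta = X'y."""
    return np.linalg.solve(X.T @ X, X.T @ y)
```

No learning rate, no iterations: one linear solve gives theta directly, as long as X'X is invertible.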
The comparison above summarizes the trade-offs between gradient descent and the normal equation: gradient descent needs a learning rate and many iterations, but it works well even when the number of features is very large; the normal equation needs no learning rate and no iterations, but computing (X' * X)^-1 becomes slow when the number of features is large.
What should we do if X' * X is not invertible (i.e. |X' * X| = 0)? This usually means some features are redundant (linearly dependent) or there are too many features relative to the number of training examples; we can delete redundant features, use regularization, or compute a pseudo-inverse instead of the ordinary inverse.
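A small sketch of the singular case, assuming numpy: duplicating a feature makes X'X singular, yet the Moore-Penrose pseudo-inverse (`np.linalg.pinv`) still returns a valid least-squares solution (the data here is made up):

```python
import numpy as np

# Two identical columns make X'X singular (a redundant feature).
X = np.array([[1.0, 2.0, 2.0],
              [1.0, 3.0, 3.0],
              [1.0, 4.0, 4.0]])
y = np.array([5.0, 7.0, 9.0])  # exactly y = 1 + 2x for x = 2, 3, 4

# |X'X| = 0 here, so (X'X)^-1 does not exist, but the pseudo-inverse
# still yields a theta whose predictions fit the data.
theta = np.linalg.pinv(X) @ y
predictions = X @ theta
```

The pseudo-inverse resolves the ambiguity among the infinitely many solutions by returning the minimum-norm one; the predictions are the same either way.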
This article is excerpted from the courseware "Machine Learning" by Andrew Ng of Stanford University.
SAT2 math solution: how should we calculate a least-squares linear regression?
A calculator (TI-83 or above) is required; calculating this by hand is too tedious.
Steps:
1. Turn on the calculator.
2. Press STAT (the second button in the third row; it stands for statistics).
3. Select 1: Edit.
4. Enter the x values into list L1 and the y values into list L2.
You will get a table like this:
L1 L2
0 15000
1 13000
2 10900
5 3000
5. Press STAT again, move one cell to the right to the CALC menu, select 4: LinReg(ax+b), and confirm; the calculator will compute the least-squares linear regression for you.
For the data in this question, the equation obtained is y = -2428.571x + 15332.142.
6. The question asks for the value when x = 4; substituting 4 gives 5617.858, so the answer is C.
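The calculator's result can be cross-checked in a few lines, for instance with numpy's `polyfit` (this is just an independent verification of the same least-squares fit, not part of the original solution):

```python
import numpy as np

# The four (x, y) points entered into lists L1 and L2 on the calculator.
x = np.array([0.0, 1.0, 2.0, 5.0])
y = np.array([15000.0, 13000.0, 10900.0, 3000.0])

# Degree-1 least-squares fit: y = slope * x + intercept.
slope, intercept = np.polyfit(x, y, 1)
prediction = slope * 4 + intercept
```

The fitted slope and intercept match the calculator's LinReg(ax+b) output, and the prediction at x = 4 matches answer C.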