http://blog.csdn.net/pipisorry/article/details/43529845
Machine Learning (Andrew Ng's Course) Study Notes
Multivariate Linear Regression
(linear regression with multiple variables, i.e. multiple features)
Multiple Features (Variables)
{x^(i) (superscript i) denotes the i-th training example; x_j^(i) (subscript j) denotes the value of the j-th feature in the i-th training example}
The hypothesis representation for linear regression with multiple features (variables)
An additional zeroth feature x0 is added for ease of representation: every example i has a feature vector x^(i) whose component x_0^(i) is equal to 1.
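With the extra x0 = 1 component, the hypothesis is just the inner product of the parameter vector θ and the feature vector x. A minimal sketch in plain Python (the feature values below are made up for illustration):

```python
# Multivariate hypothesis h_theta(x) = theta_0*x_0 + theta_1*x_1 + ...,
# where the feature vector x already includes the extra entry x_0 = 1.
def hypothesis(theta, x):
    """Inner product of parameter vector theta and feature vector x."""
    return sum(t * xi for t, xi in zip(theta, x))

# Example: raw features [5, 6] with x0 = 1 prepended.
theta = [1.0, 2.0, 3.0]
x = [1, 5, 6]
print(hypothesis(theta, x))  # 1*1 + 2*5 + 3*6 = 29.0
```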
Gradient Descent for Multiple Variables
Model representation
The minimum of the cost function is found with the gradient descent algorithm, which solves for the parameters θ.
{The left side shows the gradient descent update that solves for the parameters of single-variable linear regression;
the right side shows the corresponding update for multivariable linear regression}
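The multivariable update rule, applied simultaneously for every parameter j, is θ_j := θ_j − α·(1/m)·Σ_i (h_θ(x^(i)) − y^(i))·x_j^(i). A minimal batch gradient descent sketch in plain Python (the toy data and hyperparameter values are made up for illustration):

```python
# Batch gradient descent for multivariate linear regression.
# X holds m examples, each a feature vector with x0 = 1 prepended.
def gradient_descent(X, y, alpha=0.1, iters=1000):
    m, n = len(X), len(X[0])      # m examples, n parameters (incl. theta_0)
    theta = [0.0] * n
    for _ in range(iters):
        # prediction errors h_theta(x^(i)) - y^(i) under the current theta
        errors = [sum(t * xj for t, xj in zip(theta, x)) - yi
                  for x, yi in zip(X, y)]
        # simultaneous update of all parameters theta_j
        theta = [theta[j]
                 - alpha / m * sum(e * X[i][j] for i, e in enumerate(errors))
                 for j in range(n)]
    return theta

# Toy data generated from y = 1 + 2*x1.
X = [[1, 0], [1, 1], [1, 2]]
y = [1, 3, 5]
theta = gradient_descent(X, y)    # converges near [1.0, 2.0]
```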
Gradient Descent in Practice I: Feature Scaling
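Feature scaling puts all features on a similar scale so gradient descent converges faster. One common form is mean normalization, x_j := (x_j − μ_j) / s_j, where μ_j is the feature's mean and s_j its range (max − min). A short sketch (the house-size numbers are made up for illustration):

```python
# Mean normalization of one feature column: subtract the mean, divide by
# the range (max - min), so the scaled values lie roughly in [-0.5, 0.5].
def mean_normalize(column):
    mu = sum(column) / len(column)
    s = max(column) - min(column)
    return [(x - mu) / s for x in column]

sizes = [2104, 1416, 1534, 852]   # e.g. house sizes in square feet
scaled = mean_normalize(sizes)    # mean ~0, range exactly 1
```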
Gradient Descent in Practice II: Learning Rate
Features and Polynomial Regression
Normal Equation
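The normal equation solves for θ analytically, θ = (X^T X)^(−1) X^T y, with no iterations and no learning rate. A sketch for the two-parameter case, solving the 2×2 system (X^T X)θ = X^T y by Cramer's rule instead of forming an explicit inverse (the toy data is made up for illustration):

```python
# Normal equation for 2 parameters: solve (X^T X) theta = X^T y directly.
def normal_equation_2d(X, y):
    # A = X^T X (symmetric 2x2), b = X^T y (2-vector)
    a00 = sum(x[0] * x[0] for x in X)
    a01 = sum(x[0] * x[1] for x in X)
    a11 = sum(x[1] * x[1] for x in X)
    b0 = sum(x[0] * yi for x, yi in zip(X, y))
    b1 = sum(x[1] * yi for x, yi in zip(X, y))
    det = a00 * a11 - a01 * a01          # determinant of A
    # Cramer's rule: theta_j = det(A with column j replaced by b) / det(A)
    return [(b0 * a11 - b1 * a01) / det,
            (a00 * b1 - a01 * b0) / det]

# Toy data generated from y = 1 + 2*x1, with x0 = 1 prepended.
X = [[1, 0], [1, 1], [1, 2]]
y = [1, 3, 5]
theta = normal_equation_2d(X, y)  # [1.0, 2.0]
```

Unlike gradient descent, this needs no feature scaling, but inverting (or solving with) X^T X costs roughly O(n^3) in the number of features, so gradient descent is preferred when n is large.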
Machine Learning IV. Linear Regression with Multiple Variables (Week 2)