Ufldl Study Notes and programming assignments: Linear Regression (linear regression)

UFLDL now provides a new tutorial, which is better than the previous one. It starts from the basics, is systematic and clear, and includes programming exercises. With a tutorial of this quality you can learn deep learning directly, without first working through every other machine learning algorithm.

So I started working through it recently. The tutorial, combined with MATLAB programming, is a great fit.

The address of the new tutorial is: http://ufldl.stanford.edu/tutorial/


Link to this study: http://ufldl.stanford.edu/tutorial/supervised/LinearRegression/


Starting from the simplest case, linear regression, we can clearly see the general approach of modeling to solve a problem:

1. Define the objective function;

2. Optimize the objective function: take the partial derivatives to obtain the gradient, then find the optimal solution with an optimization method such as gradient descent or Newton's method.
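The two steps above can be sketched numerically. Here is a minimal gradient-descent sketch for the least-squares objective in Python/NumPy (the exercise itself uses MATLAB, so this is only an illustration; the data and the name `learning_rate` are my own):

```python
import numpy as np

# Toy data: m examples with n features; columns are examples, as in the exercise.
rng = np.random.default_rng(0)
n, m = 3, 50
X = rng.normal(size=(n, m))
true_theta = np.array([1.0, -2.0, 0.5])
y = true_theta @ X                   # noiseless targets

theta = np.zeros(n)
learning_rate = 0.05
for _ in range(500):
    h = theta @ X                    # predictions, shape (m,)
    g = X @ (h - y)                  # gradient of (1/2)*sum((h-y)^2)
    theta -= learning_rate * g / m   # averaged step for stability

print(theta)                         # approaches [1.0, -2.0, 0.5]
```

With noiseless targets the iterates converge to the generating parameters, which is a handy sanity check before moving to real data.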


The exercise here is a bit special in that we do not need to implement gradient descent ourselves.

Instead, we compute the partial derivatives of the objective function with respect to the parameters, and hand the remaining optimization work to a function called minFunc.
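minFunc expects a single function that returns both the objective value and its gradient. SciPy's `minimize` with `jac=True` follows the analogous convention, so the same division of labor can be sketched in Python (the data here is made up for illustration):

```python
import numpy as np
from scipy.optimize import minimize

def objective(theta, X, y):
    """minFunc-style callback: one call returns (cost, gradient)."""
    h = theta @ X                 # predictions
    r = h - y                     # residuals
    f = 0.5 * np.sum(r ** 2)      # cost
    g = X @ r                     # gradient w.r.t. theta
    return f, g

rng = np.random.default_rng(1)
X = rng.normal(size=(2, 40))
y = np.array([3.0, -1.0]) @ X

# jac=True tells the optimizer the callback returns (f, g) together.
res = minimize(objective, np.zeros(2), args=(X, y), jac=True, method="L-BFGS-B")
print(res.x)                      # close to [3.0, -1.0]
```

As with minFunc, we only supply the objective and gradient; the quasi-Newton bookkeeping is the optimizer's job.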


Originally, this section only asks readers to implement the computation with the simplest for loop; the following section then requires the vectorized version.

Since we are already familiar with linear regression, we will be lazy here and use the vectorized version directly.


The linear_regression.m code is as follows:

function [f, g] = linear_regression(theta, X, y)
  %
  % Arguments:
  %   theta - A vector containing the parameter values to optimize.
  %   X - The examples stored in a matrix.
  %       X(i,j) is the i'th coordinate of the j'th example.
  %   y - The target value for each example. y(j) is the target for example j.
  %
  % TODO: Compute the linear regression objective by looping over the
  %       examples in X. Store the objective function value in 'f'.
  % TODO: Compute the gradient of the objective with respect to theta by
  %       looping over the examples in X and adding up the gradient for each
  %       example. Store the computed gradient in 'g'.

  m = size(X, 2);  % number of columns (examples)
  n = size(X, 1);  % number of rows (features)

  f = 0;
  g = zeros(size(theta));

  h = theta' * X;               % hypothesis values, 1 x m
  f = (1/2) * sum((h - y).^2);  % objective value; I got this wrong at first:
                                % the objective is the cost function,
                                % not the hypothesis function
  g = X * (h - y)';             % gradient with respect to theta, n x 1
end
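For readers without MATLAB, the same vectorized computation translates directly to NumPy (my own translation, not part of the starter code; the tiny test data is made up):

```python
import numpy as np

def linear_regression(theta, X, y):
    """Squared-error objective and gradient; X has one example per column."""
    h = theta @ X                   # h = theta' * X in the MATLAB version
    f = 0.5 * np.sum((h - y) ** 2)  # the cost, not the hypothesis itself
    g = X @ (h - y)                 # g = X * (h - y)' in the MATLAB version
    return f, g

theta = np.array([1.0, 2.0])
X = np.array([[1.0, 0.0],
              [0.0, 1.0]])          # two examples: columns (1,0) and (0,1)
y = np.array([0.0, 0.0])

f, g = linear_regression(theta, X, y)
print(f, g)  # f = 0.5*(1 + 4) = 2.5, g = [1., 2.]
```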

The result is as follows (the output screenshot is not reproduced here):

For vectorized programming, it helps to keep a mental picture of every matrix involved.

If you cannot picture them, sketch the dimensions out on paper.

I also mentioned this in an earlier article of mine, "Machine Learning: Linear Regression".


In fact, I hit two pitfalls while doing this assignment tonight.

The first was computing f. I thought f was the value h of the hypothesis function, when it is actually the objective, i.e. the cost function.

At first, when a function called inside the library routine minFunc reported an error, I assumed the provided code had a bug.

Later, I found the mistake was mine.
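A mistake like confusing the hypothesis with the cost is easy to catch with a finite-difference gradient check; here is a minimal sketch of that idea in Python (my own, not part of the exercise):

```python
import numpy as np

def objective(theta, X, y):
    """Squared-error cost and its analytic gradient."""
    h = theta @ X
    return 0.5 * np.sum((h - y) ** 2), X @ (h - y)

def numerical_grad(fun, theta, eps=1e-6):
    """Central differences on the scalar objective returned by fun."""
    g = np.zeros_like(theta)
    for i in range(theta.size):
        e = np.zeros_like(theta)
        e[i] = eps
        g[i] = (fun(theta + e)[0] - fun(theta - e)[0]) / (2 * eps)
    return g

rng = np.random.default_rng(2)
X = rng.normal(size=(4, 30))
y = rng.normal(size=30)
theta = rng.normal(size=4)

analytic = objective(theta, X, y)[1]
numeric = numerical_grad(lambda t: objective(t, X, y), theta)
print(np.max(np.abs(analytic - numeric)))  # should be tiny
```

If f were mistakenly set to the hypothesis instead of the cost, the two gradients would disagree immediately.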

The second was calling C code from Octave, for example lbfgsAddC.c and lbfgsProdC.c, which are in the mex folder.

After checking the relevant documentation, I learned that these files must first be compiled into MEX files before Octave can call them.

The .m files are generally kept in the same directory as the MEX files, though you can also specify the folder.



https://www.gnu.org/software/octave/doc/interpreter/Getting-Started-with-Mex_002dFiles.html#Getting-Started-with-Mex_002dFiles

Compile a C file into a MEX file:

mkoctfile --mex myhello.c

mkoctfile lives in Octave's bin directory, and it in turn calls gcc and g++.
Therefore, you must add the gcc and g++ directories to your PATH environment variable.




Linger

Link: http://blog.csdn.net/lingerlanlan/article/details/38377023





