Machine Learning Foundations (Coursera-NTU) Homework 4, Q13-20: MATLAB Implementation


Hello everyone, I am mac Jiang. Today I would like to share my MATLAB implementation of questions 13-20 of Homework 4 from the Coursera-NTU Machine Learning Foundations course. I implemented the earlier assignments in C++, but the C++ code turned out to be cumbersome, especially since this assignment requires changing parameter values frequently, so I switched to MATLAB. Compared with C++, MATLAB is much easier to work with and more convenient for importing data. Although my code produces the correct answers, some of the ideas or details may still be wrong; if you spot a mistake, please leave a comment so I can correct it, thank you! As always, the code I provide is not meant to let you pass the quizzes for you, but to offer a reference for students who run into difficulties; I hope this article is of some help to your studies.

The source of this article: http://blog.csdn.net/a1015553840/article/details/51173020

For additional questions, see the summary: http://blog.csdn.net/a1015553840/article/details/51085129

1. The sign function

function s = sign(x)   % element-wise sign function; overrides the built-in so that sign(0) = -1
[m, n] = size(x);
s = zeros(m, n);
for i = 1:m,
    for j = 1:n,
        if x(i, j) <= 0,
            s(i, j) = -1;
        else
            s(i, j) = 1;
        end
    end
end
end

2. The regularized linear regression function lgwithregularization

function wreg = lgwithregularization(X, y, lambda)
[m, n] = size(X);
wreg = inv(X' * X + lambda * eye(n)) * X' * y;   % closed-form solution of regularized linear regression
end
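
The closed-form above uses inv, which is fine for a problem this small. As a side note (my own variant, not part of the original post), the same solution can be obtained more stably by solving the linear system with the backslash operator:

function wreg = lgwithregularization(X, y, lambda)
% Same regularized linear regression solution, wreg = (X'X + lambda*I)^(-1) * X'y,
% computed by solving the linear system instead of forming the explicit inverse.
[m, n] = size(X);
wreg = (X' * X + lambda * eye(n)) \ (X' * y);
end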

3. The error function Error01 (note that the 0/1 error is used here)

function e = Error01(X, y, wreg)
[m, n] = size(X);
e = 1 - sum(sign(X * wreg) == y) / m;   % misclassification (0/1 error) rate
end
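
A quick sanity check of Error01 on made-up numbers (toy data of my own, not from the assignment), just to illustrate the 0/1 error:

X = [1 2 1; 1 -1 -2; 1 0.5 -1];   % each row is [bias, x1, x2]
y = [1; -1; 1];                   % true labels
w = [0; 1; 1];                    % X*w = [3; -3; -0.5], so the third point is misclassified
Error01(X, y, w)                  % returns 1/3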

4. Main script

clc;
trainingData = load('trainingData.txt');
Xtrain = trainingData(:, [1, 2]);
ytrain = trainingData(:, 3);
testData = load('testData.txt');
Xtest = testData(:, [1, 2]);
ytest = testData(:, 3);
[m, n] = size(Xtrain);
Xtrain = [ones(m, 1), Xtrain];      % add the bias column
[a, b] = size(Xtest);
Xtest = [ones(a, 1), Xtest];

% 13-15
% lambda = 10^-3;
% wreg = lgwithregularization(Xtrain, ytrain, lambda);
% ein = Error01(Xtrain, ytrain, wreg)
% eout = Error01(Xtest, ytest, wreg)

% 16-17
% lambda = 10^-3;
% wreg = lgwithregularization(Xtrain(1:120, :), ytrain(1:120, :), lambda);
% etrain = Error01(Xtrain(1:120, :), ytrain(1:120, :), wreg)
% eval = Error01(Xtrain(121:200, :), ytrain(121:200, :), wreg)
% eout = Error01(Xtest, ytest, wreg)

% 18
% lambda = 10^0;
% wreg = lgwithregularization(Xtrain, ytrain, lambda);
% ein = Error01(Xtrain, ytrain, wreg)
% eout = Error01(Xtest, ytest, wreg)

% 19
% lambda = 10^-6;
% ecv = 0;
% v = 5;
% per = m / v;
% for i = 1:v,
%     Xtemp = Xtrain;
%     ytemp = ytrain;
%     Xtemp(1 + (i-1)*per : i*per, :) = [];   % remove the held-out cross-validation fold
%     ytemp(1 + (i-1)*per : i*per, :) = [];
%     wreg = lgwithregularization(Xtemp, ytemp, lambda);
%     % evaluate on the held-out fold and accumulate Ecv
%     ecv = ecv + Error01(Xtrain(1 + (i-1)*per : i*per, :), ytrain(1 + (i-1)*per : i*per, :), wreg);
% end
% ecv = ecv / v

% 20
% lambda = 10^-8;
% wreg = lgwithregularization(Xtrain, ytrain, lambda);
% ein = Error01(Xtrain, ytrain, wreg)
% eout = Error01(Xtest, ytest, wreg)


Question 13


(1) Problem: Download the training and test samples from the two given URLs, run regularized linear regression with lambda = 10, and report Ein and Eout (a minimal standalone run is sketched after the answer below).

(2) Answer: Ein = 0.050, Eout = 0.045
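
For reference, a minimal sketch of Question 13 run on its own, assuming the data files are named trainingData.txt and testData.txt as in the main script above (rename them to whatever you saved the downloads as):

trainingData = load('trainingData.txt');
testData = load('testData.txt');
Xtrain = [ones(size(trainingData, 1), 1), trainingData(:, [1, 2])];
ytrain = trainingData(:, 3);
Xtest = [ones(size(testData, 1), 1), testData(:, [1, 2])];
ytest = testData(:, 3);
lambda = 10;                                  % Question 13 fixes lambda = 10
wreg = lgwithregularization(Xtrain, ytrain, lambda);
ein = Error01(Xtrain, ytrain, wreg)           % should come out around 0.050
eout = Error01(Xtest, ytest, wreg)            % should come out around 0.045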


Questions 14-15


(1) Problem: 14. For each candidate value of lambda, compute Ein and Eout, and choose the answer with the smallest Ein; if two values of lambda tie, choose the larger one (a sketch of the sweep follows the answers below).

15. Choose the answer with the smallest Eout.

(2) Answer: 14. log10(lambda) = -8, Ein = 0.015, Eout = 0.02

15. log10(lambda) = -7, Ein = 0.03, Eout = 0.015
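
A sketch of the lambda sweep for Questions 14-15, assuming Xtrain, ytrain, Xtest, and ytest have already been built as in the main script, and assuming the candidate exponents run from 2 down to -10 (substitute the exact set given in your assignment if it differs):

logLambdas = 2:-1:-10;                        % assumed candidate exponents
ein = zeros(size(logLambdas));
eout = zeros(size(logLambdas));
for k = 1:length(logLambdas)
    wreg = lgwithregularization(Xtrain, ytrain, 10^logLambdas(k));
    ein(k) = Error01(Xtrain, ytrain, wreg);
    eout(k) = Error01(Xtest, ytest, wreg);
end
[~, i14] = min(ein);                          % Q14: smallest Ein; since the exponents are listed in
logLambdas(i14)                               % decreasing order, ties resolve to the larger lambda
[~, i15] = min(eout);                         % Q15: smallest Eout
logLambdas(i15)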


Question 16

(1) Problem: Use the first 120 samples for training and the last 80 samples for validation; for each lambda compute the corresponding Etrain, Eval, and Eout, and choose the answer with the smallest Etrain.

(2) Answer: log10(lambda) = -8, Etrain = 0, Eval = 0.05, Eout = 0.025


Question 17

(1) Problem: Same setup as Question 16, but choose the answer with the smallest Eval (a combined sketch for Questions 16-17 follows the answer below).

(2) Answer: log10(lambda) = 0, Etrain = 0.0333, Eval = 0.0375, Eout = 0.0280
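
A combined sketch for Questions 16-17, reusing the same assumed candidate set logLambdas from the sketch above and the 120/80 train/validation split:

Xtr = Xtrain(1:120, :);    ytr = ytrain(1:120, :);
Xval = Xtrain(121:200, :); yval = ytrain(121:200, :);
etrain = zeros(size(logLambdas));
evalerr = zeros(size(logLambdas));            % not named "eval" to avoid shadowing the built-in
for k = 1:length(logLambdas)
    wreg = lgwithregularization(Xtr, ytr, 10^logLambdas(k));
    etrain(k) = Error01(Xtr, ytr, wreg);
    evalerr(k) = Error01(Xval, yval, wreg);
end
[~, i16] = min(etrain); logLambdas(i16)       % Q16: smallest Etrain
[~, i17] = min(evalerr); logLambdas(i17)      % Q17: smallest Eval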


Question 18

(1) Problem: Using the optimal lambda found in Question 17, train on all 200 samples and compute Ein and Eout (sketched after the answer below).

(2) Answer: Ein = 0.035, Eout = 0.02
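
Question 18 then just retrains on all 200 samples with the lambda picked in Question 17 (log10(lambda) = 0 according to the answer above); a minimal sketch:

wreg = lgwithregularization(Xtrain, ytrain, 10^0);
ein = Error01(Xtrain, ytrain, wreg)           % should come out around 0.035
eout = Error01(Xtest, ytest, wreg)            % should come out around 0.02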


Questions 19-20

(1) Problem: 19. Split the training samples into 5 folds, compute Ecv by cross-validation for each lambda, and choose the lambda with the smallest Ecv (a sketch of the whole procedure follows the answers below).

20. Using the lambda that achieved the smallest Ecv in Question 19, retrain on all training samples and compute Ein and Eout.

(2) Answer: 19. log10(lambda) = -8, Ecv = 0.03

20. Ein = 0.015, Eout = 0.02
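
Finally, a sketch of the 5-fold cross-validation sweep for Questions 19-20, again reusing the assumed logLambdas candidates; the fold indexing assumes the 200 training samples split evenly into 5 folds of 40:

v = 5;
per = size(Xtrain, 1) / v;                    % 200 / 5 = 40 samples per fold
ecv = zeros(size(logLambdas));
for k = 1:length(logLambdas)
    lambda = 10^logLambdas(k);
    for i = 1:v
        idx = 1 + (i-1)*per : i*per;          % indices of the held-out fold
        Xtemp = Xtrain; ytemp = ytrain;
        Xtemp(idx, :) = [];                   % train on the remaining four folds
        ytemp(idx, :) = [];
        wreg = lgwithregularization(Xtemp, ytemp, lambda);
        ecv(k) = ecv(k) + Error01(Xtrain(idx, :), ytrain(idx, :), wreg);
    end
    ecv(k) = ecv(k) / v;
end
[~, i19] = min(ecv); logLambdas(i19)          % Q19: lambda with the smallest Ecv

wreg = lgwithregularization(Xtrain, ytrain, 10^logLambdas(i19));   % Q20: retrain on everything
ein = Error01(Xtrain, ytrain, wreg)           % should come out around 0.015
eout = Error01(Xtest, ytest, wreg)            % should come out around 0.02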

