Effects of different λ values (0, 1, 10, 100) on regularization \ Predicting new values and calculating the accuracy of the model
%% ============= Part 2: Regularization and Accuracies =============
% Optional Exercise:
% In this section, you will get to try different values of lambda and
% see how regularization affects the decision boundary.
%
% Try the following values of lambda (0, 1, 10, 100).
%
% How does the decision boundary change when you vary lambda? How does
% the training set accuracy vary?
%
% Initialize fitting parameters
initial_theta = zeros(size(X, 2), 1);
% Set regularization parameter lambda to 1 (you should vary this)
lambda = 1; % set λ here (0, 1, 10, or 100)
% Visually, λ = 1 gives the best result:
% λ = 0 means no regularization (the model overfits);
% λ = 100 applies too much regularization (the model underfits).
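The optimizer below calls costFunctionReg, which is not listed in this post. A minimal sketch of what that function might compute, assuming the standard elementwise sigmoid helper from the exercise (the intercept theta(1) is conventionally left unpenalized):

```matlab
function [J, grad] = costFunctionReg(theta, X, y, lambda)
% Sketch of the regularized logistic regression cost and gradient.
m = length(y);            % number of training examples
h = sigmoid(X * theta);   % hypothesis values in (0, 1)
% Cross-entropy cost plus an L2 penalty on theta(2:end)
J = (1 / m) * sum(-y .* log(h) - (1 - y) .* log(1 - h)) ...
    + (lambda / (2 * m)) * sum(theta(2:end) .^ 2);
% Gradient; the penalty term skips the intercept
grad = (1 / m) * (X' * (h - y)) + (lambda / m) * [0; theta(2:end)];
end
```

Returning both J and grad from one function is what allows fminunc to use the analytic gradient when 'GradObj' is 'on'.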
% Set Options
options = optimset('GradObj', 'on', 'MaxIter', 400); % use the analytic gradient; limit to 400 iterations
% Optimize
[theta, J, exit_flag] = ...
    fminunc(@(t)(costFunctionReg(t, X, y, lambda)), initial_theta, options);
% Plot Boundary
plotDecisionBoundary(theta, X, y); % X has already been mapped by mapFeature
hold on;
title(sprintf('lambda = %g', lambda)) % %g automatically chooses between %e and %f, with no trailing zeros
% Labels and Legend
xlabel('Microchip Test 1')
ylabel('Microchip Test 2')
legend('y = 1', 'y = 0', 'Decision boundary')
hold off;
% Compute accuracy on our training set
p = predict(theta, X);
fprintf('Train Accuracy: %f\n', mean(double(p == y)) * 100);
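The section title also promises predicting new values. A sketch of how that could be done for a hypothetical new microchip, assuming the mapFeature and sigmoid helpers from the exercise (the two test scores here are made-up values for illustration only):

```matlab
% Hypothetical new microchip; test scores chosen purely for illustration
test1 = 0.25;
test2 = 0.60;
x_new = mapFeature(test1, test2);   % map into the same 28-dimensional feature space
prob = sigmoid(x_new * theta);      % predicted probability that y = 1
label = double(prob >= 0.5);        % threshold at 0.5, as in predict.m
fprintf('Predicted probability: %f, label: %d\n', prob, label);
```

The new point must go through the same mapFeature transform as the training data, otherwise its dimensions will not match theta.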
plotDecisionBoundary.m file
function plotDecisionBoundary(theta, X, y)
%PLOTDECISIONBOUNDARY Plots the data points X and y into a new figure with
%the decision boundary defined by theta
%   PLOTDECISIONBOUNDARY(theta, X, y) plots the data points with + for the
%   positive examples and o for the negative examples. X is assumed to be
%   either
%   1) an Mx3 matrix, where the first column is an all-ones column for the
%      intercept.
%   2) an MxN, N > 3 matrix, where the first column is all ones
% Plot Data
plotData(X(:, 2:3), y);
hold on
if size(X, 2) <= 3
    % Only need 2 points to define a line, so choose two endpoints
    plot_x = [min(X(:, 2)) - 2, max(X(:, 2)) + 2];
    % Calculate the decision boundary line: theta(1) + theta(2)*x + theta(3)*y = 0
    plot_y = (-1 ./ theta(3)) .* (theta(2) .* plot_x + theta(1));
    plot(plot_x, plot_y)
    % Legend, specific for the exercise
    legend('Admitted', 'Not admitted', 'Decision Boundary')
    axis([30, 100, 30, 100])
else % X has already been through mapFeature (28 features), so this branch runs
    % Here is the grid range
    u = linspace(-1, 1.5, 50);
    v = linspace(-1, 1.5, 50);
    z = zeros(length(u), length(v));
    % Evaluate z = theta*x over the grid
    for i = 1:length(u)
        for j = 1:length(v)
            z(i, j) = mapFeature(u(i), v(j)) * theta;
        end
    end
    z = z'; % important to transpose z before calling contour
    % Plot z = 0
    % Notice you need to specify the level range [0, 0]
    contour(u, v, z, [0, 0], 'LineWidth', 2) % contour(X, Y, Z, [v v]) draws contours for the single level v
end % if size(X, 2) <= 3 ... else ... end
hold off
end
predict.m file
function p = predict(theta, X)
%PREDICT Predict whether the label is 0 or 1 using learned logistic
%regression parameters theta
%   p = PREDICT(theta, X) computes the predictions for X using a
%   threshold at 0.5 (i.e., if sigmoid(theta'*x) >= 0.5, predict 1)
m = size(X, 1); % Number of training examples
% You need to return the following variables correctly
p = zeros(m, 1);
% ====================== YOUR CODE HERE ======================
% Instructions: Complete the following code to make predictions using
%               your learned logistic regression parameters.
%               You should set p to a vector of 0's and 1's
%
for i = 1:m
    if sigmoid(X(i, :) * theta) >= 0.5
        p(i) = 1;
    else
        p(i) = 0;
    end
end
% =========================================================================
end
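Since sigmoid typically operates elementwise on a vector, the loop in predict.m can be collapsed into a single statement. A vectorized sketch, assuming that elementwise behavior:

```matlab
% Vectorized alternative to the loop: threshold all m probabilities at once
p = double(sigmoid(X * theta) >= 0.5);
```

The comparison returns a logical vector, which double converts to the required vector of 0's and 1's; this is the more idiomatic MATLAB form and avoids the per-row loop.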