1. Define the cost function that measures the error.
2. Fit the parameter theta to minimize the cost function: gradient descent iterates n times, updating theta on each iteration and driving the cost down.
3. Use the fitted parameter theta to make predictions.
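The three steps above can be sketched end to end in NumPy (a hedged translation of the Octave workflow below, not course code; the tiny dataset, learning rate, and iteration count are invented for illustration):

```python
import numpy as np

# Hypothetical tiny dataset generated from y = 2x
X = np.c_[np.ones(4), np.array([1.0, 2.0, 3.0, 4.0])]  # prepend intercept column
y = np.array([2.0, 4.0, 6.0, 8.0])
m = len(y)

theta = np.zeros(2)            # parameters to fit
alpha, n_iters = 0.1, 2000     # learning rate, iteration count (chosen by hand)

def cost(theta):
    # Step 1: squared-error cost J(theta) = sum((h - y)^2) / (2m)
    h = X @ theta
    return np.sum((h - y) ** 2) / (2 * m)

for _ in range(n_iters):
    # Step 2: simultaneous gradient-descent update of all components of theta
    grad = X.T @ (X @ theta - y) / m
    theta = theta - alpha * grad

# Step 3: theta is now fit; predict for a new input x = 5
prediction = np.array([1.0, 5.0]) @ theta
```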
I. Linear Regression
computeCost:

```matlab
J = 0;
for i = 1:m
    h = X(i,:) * theta;
    J = J + (h - y(i))^2;
end
J = J / (2*m);
```
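The loop above can also be written without the explicit sum; here is a vectorized NumPy sketch of the same squared-error cost (function and variable names are my own, not from the course):

```python
import numpy as np

def compute_cost(X, y, theta):
    """Squared-error cost J = sum((X*theta - y)^2) / (2m), fully vectorized."""
    m = len(y)
    errors = X @ theta - y        # h - y for all m examples at once
    return (errors @ errors) / (2 * m)

# Small check: X has an intercept column, y = x exactly
X = np.array([[1.0, 1.0], [1.0, 2.0], [1.0, 3.0]])
y = np.array([1.0, 2.0, 3.0])
theta = np.zeros(2)               # cost here is (1 + 4 + 9) / (2*3)
```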
Gradient descent: fitting the parameter theta
```matlab
for iter = 1:num_iters
    grad = zeros(size(theta,1), 1);   % accumulator (renamed from "sum", which shadows the builtin)
    for j = 1:size(theta,1)
        for i = 1:m
            h = X(i,:) * theta;
            grad(j) = grad(j) + (h - y(i)) * X(i,j);
        end
        % theta(j) = theta(j) - alpha * grad(j) / m;  % wrong here: theta must be updated simultaneously
    end
    theta = theta - alpha .* grad ./ m;

    % Save the cost J in every iteration
    J_history(iter) = computeCostMulti(X, y, theta);
end
```
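The double loop over j and i computes every partial derivative separately; in matrix form the whole gradient is X'(X*theta - y)/m. A hedged NumPy sketch (the toy dataset and hyperparameters are invented):

```python
import numpy as np

def gradient_descent(X, y, theta, alpha, num_iters):
    """Batch gradient descent with a simultaneous update of all theta(j)."""
    m = len(y)
    J_history = []
    for _ in range(num_iters):
        grad = X.T @ (X @ theta - y) / m   # all partial derivatives at once
        theta = theta - alpha * grad       # simultaneous update
        errors = X @ theta - y
        J_history.append((errors @ errors) / (2 * m))  # cost after this step
    return theta, J_history

# Toy data generated from y = 1 + 2x
X = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 2.0]])
y = np.array([1.0, 3.0, 5.0])
theta, J_history = gradient_descent(X, y, np.zeros(2), alpha=0.3, num_iters=1500)
```

Logging the cost each iteration, as the Octave version does with J_history, is a cheap sanity check: the sequence should be monotonically decreasing, otherwise alpha is too large.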
II. Logistic Regression
Cost function
```matlab
function [J, grad] = costFunctionReg(theta, X, y, lambda)
%COSTFUNCTIONREG Compute cost and gradient for logistic regression with regularization
%   J = COSTFUNCTIONREG(theta, X, y, lambda) computes the cost of using
%   theta as the parameter for regularized logistic regression and the
%   gradient of the cost w.r.t. the parameters.

% Initialize some useful values
m = length(y);   % number of training examples
J = 0;
grad = zeros(size(theta));

% h_fun(x, theta) is the hypothesis helper (the sigmoid 1 / (1 + exp(-x*theta)))
for i = 1:m
    J = J - y(i)*log(h_fun(X(i,:), theta)) - (1 - y(i))*log(1 - h_fun(X(i,:), theta));
end
J = J / m;

% Regularization term; theta(1) (the intercept) is not regularized
reg = 0;
for j = 2:length(theta)
    reg = reg + theta(j)^2;
end
J = J + reg * lambda / (2*m);

% Gradient for theta(1): no regularization term
for i = 1:m
    grad(1) = grad(1) + (h_fun(X(i,:), theta) - y(i)) * X(i,1);
end
grad(1) = grad(1) / m;

% Gradient for the remaining components; the regularization term is added
% once, outside the loop over examples (adding lambda*theta(j)/m inside the
% inner loop and then dividing by m happens to give the same value, but is
% much harder to read)
for j = 2:length(theta)
    for i = 1:m
        grad(j) = grad(j) + (h_fun(X(i,:), theta) - y(i)) * X(i,j);
    end
    grad(j) = grad(j) / m + lambda * theta(j) / m;
end
end
```
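For comparison, a vectorized NumPy sketch of the same regularized cost and gradient (function names and the small check data are my own; `sigmoid` stands in for the `h_fun` helper):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def cost_function_reg(theta, X, y, lam):
    """Regularized logistic-regression cost and gradient; theta[0] is unpenalized."""
    m = len(y)
    h = sigmoid(X @ theta)
    J = (-y @ np.log(h) - (1 - y) @ np.log(1 - h)) / m
    J += lam / (2 * m) * np.sum(theta[1:] ** 2)    # skip the intercept term
    grad = X.T @ (h - y) / m
    grad[1:] += lam / m * theta[1:]                 # regularize all but theta[0]
    return J, grad

# Quick check with theta = 0: h = 0.5 everywhere, so J = log(2)
X = np.array([[1.0, 1.0], [1.0, 2.0]])
y = np.array([0.0, 1.0])
J, grad = cost_function_reg(np.zeros(2), X, y, lam=1.0)
```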
Parameter fitting
```matlab
% Initialize fitting parameters
initial_theta = zeros(size(X, 2), 1);

% Set regularization parameter lambda (you should vary this)
lambda = 0;

% Set options
options = optimset('GradObj', 'on', 'MaxIter', 400);

% Optimize
[theta, J, exit_flag] = ...
    fminunc(@(t)(costFunctionReg(t, X, y, lambda)), initial_theta, options);
```
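If you are working in Python rather than Octave, `scipy.optimize.minimize` plays the role of `fminunc`; with `jac=True` it accepts a function that returns the cost and gradient together, just like `costFunctionReg`. A hedged sketch (the toy dataset, lambda, and solver choice are my own):

```python
import numpy as np
from scipy.optimize import minimize

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def cost_and_grad(theta, X, y, lam):
    # Regularized logistic-regression cost and gradient (theta[0] unpenalized),
    # mirroring the Octave costFunctionReg
    m = len(y)
    h = sigmoid(X @ theta)
    J = (-y @ np.log(h) - (1 - y) @ np.log(1 - h)) / m
    J += lam / (2 * m) * np.sum(theta[1:] ** 2)
    grad = X.T @ (h - y) / m
    grad[1:] += lam / m * theta[1:]
    return J, grad

# Invented toy data: one feature, classes separated around x = 2.5
X = np.c_[np.ones(6), np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])]
y = np.array([0.0, 0.0, 0.0, 1.0, 1.0, 1.0])
lam = 1.0   # nonzero here to keep theta bounded on this separable data

initial_theta = np.zeros(X.shape[1])
res = minimize(cost_and_grad, initial_theta, args=(X, y, lam),
               jac=True, method='BFGS', options={'maxiter': 400})
theta = res.x
```

`res.fun` holds the final cost and `res.success`/`res.status` correspond to the `exit_flag` returned by `fminunc`.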