Machine Learning in Action: Implementing Multivariate Linear Regression


Multivariate linear regression works much the same way as single-variable linear regression, so we give the algorithm directly here:

The computeCostMulti function
function J = computeCostMulti(X, y, theta)
m = length(y); % number of training examples
J = 0;
predictions = X * theta;
J = 1/(2*m) * (predictions - y)' * (predictions - y);
end
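The vectorized expression above computes J(theta) = 1/(2m) · Σ(hθ(x) − y)². As a sketch of the same computation outside Octave, here is a pure-Python equivalent (the function name and toy data are illustrative, not from the original):

```python
def compute_cost_multi(X, y, theta):
    """Mean squared error cost: J(theta) = 1/(2m) * sum((X*theta - y)^2).
    X is a list of feature rows, y the targets, theta the parameters."""
    m = len(y)
    total = 0.0
    for xi, yi in zip(X, y):
        # prediction h(x) = x . theta
        pred = sum(xj * tj for xj, tj in zip(xi, theta))
        total += (pred - yi) ** 2
    return total / (2 * m)

# Toy data: first column is the intercept term, y equals the second feature,
# so theta = [0, 1] predicts perfectly and the cost is zero.
X = [[1.0, 1.0], [1.0, 2.0], [1.0, 3.0]]
y = [1.0, 2.0, 3.0]
print(compute_cost_multi(X, y, [0.0, 1.0]))  # 0.0
```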

The gradientDescentMulti function

function [theta, J_history] = gradientDescentMulti(X, y, theta, alpha, num_iters)
m = length(y); % number of training examples
J_history = zeros(num_iters, 1);
feature_number = size(X, 2);
temp = zeros(feature_number, 1);
for iter = 1:num_iters
    for i = 1:feature_number
        temp(i) = theta(i) - (alpha / m) * sum((X * theta - y) .* X(:,i));
    end
    for j = 1:feature_number
        theta(j) = temp(j);
    end
    J_history(iter) = computeCostMulti(X, y, theta);
end
end
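The `temp` vector matters: every theta(j) must be updated from the same, pre-update theta (a simultaneous update). A pure-Python sketch of the same loop, on hypothetical toy data where y equals the second feature:

```python
def gradient_descent_multi(X, y, theta, alpha, num_iters):
    """Batch gradient descent with the simultaneous update that the
    Octave code achieves via its temp vector."""
    m = len(y)
    n = len(theta)
    J_history = []
    for _ in range(num_iters):
        # residuals h(x) - y under the current theta, computed once
        errors = [sum(xj * tj for xj, tj in zip(xi, theta)) - yi
                  for xi, yi in zip(X, y)]
        # simultaneous update: every partial derivative uses the same residuals
        theta = [theta[j] - (alpha / m) * sum(e * xi[j] for e, xi in zip(errors, X))
                 for j in range(n)]
        # record the cost J(theta) after the update, as the Octave code does
        preds = [sum(xj * tj for xj, tj in zip(xi, theta)) for xi in X]
        J_history.append(sum((p - yi) ** 2 for p, yi in zip(preds, y)) / (2 * m))
    return theta, J_history

X = [[1.0, 1.0], [1.0, 2.0], [1.0, 3.0]]  # first column is the intercept term
y = [1.0, 2.0, 3.0]                        # y equals the second feature
theta, J_history = gradient_descent_multi(X, y, [0.0, 0.0], 0.1, 2000)
print(theta)  # approaches [0.0, 1.0]
```

J_history should decrease monotonically; if it grows, the learning rate alpha is too large.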
There are still a few differences, however. For example, feature scaling is needed before gradient descent begins:
function [X_norm, mu, sigma] = featureNormalize(X)
X_norm = X;
mu = zeros(1, size(X, 2));
sigma = zeros(1, size(X, 2));
mu = mean(X);
sigma = std(X);
for i = 1:size(mu, 2)
    X_norm(:,i) = (X(:,i) - mu(i)) ./ sigma(i);
end
end
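Returning mu and sigma is deliberate: the same scaling must be applied to new examples at prediction time. A pure-Python sketch of the same normalization (toy data is illustrative), using the sample standard deviation (divide by m − 1) to match Octave's std:

```python
def feature_normalize(X):
    """Scale each column to zero mean and unit standard deviation,
    returning the (mu, sigma) used so new examples can be scaled the
    same way."""
    m = len(X)
    n = len(X[0])
    mu = [sum(row[j] for row in X) / m for j in range(n)]
    # sample standard deviation (divide by m - 1), matching Octave's std
    sigma = [(sum((row[j] - mu[j]) ** 2 for row in X) / (m - 1)) ** 0.5
             for j in range(n)]
    X_norm = [[(row[j] - mu[j]) / sigma[j] for j in range(n)] for row in X]
    return X_norm, mu, sigma

# Columns on very different scales end up comparable after normalization.
X_norm, mu, sigma = feature_normalize([[1.0, 100.0], [2.0, 200.0], [3.0, 300.0]])
print(mu, sigma)  # [2.0, 200.0] [1.0, 100.0]
```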
Implementing the Normal Equation algorithm
function [theta] = normalEqn(X, y)
theta = zeros(size(X, 2), 1);
theta = pinv(X' * X) * X' * y;
end
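The closed form theta = (X'X)⁻¹X'y needs no learning rate and no feature scaling. As a pure-Python sketch for the case where X'X is invertible (the Octave version's pinv also handles the singular case, which this sketch does not), solving the normal equations by Gaussian elimination:

```python
def normal_eqn(X, y):
    """Closed-form least squares: solve (X'X) theta = X'y by Gaussian
    elimination with partial pivoting. Assumes X'X is invertible."""
    n = len(X[0])
    # A = X'X (n x n), b = X'y (length n)
    A = [[sum(row[i] * row[j] for row in X) for j in range(n)] for i in range(n)]
    b = [sum(row[i] * yi for row, yi in zip(X, y)) for i in range(n)]
    # forward elimination on the augmented system [A | b]
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[pivot] = A[pivot], A[col]
        b[col], b[pivot] = b[pivot], b[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    # back substitution
    theta = [0.0] * n
    for i in range(n - 1, -1, -1):
        theta[i] = (b[i] - sum(A[i][j] * theta[j] for j in range(i + 1, n))) / A[i][i]
    return theta

# Toy data where y is exactly twice the second feature: theta = [0, 2].
print(normal_eqn([[1.0, 1.0], [1.0, 2.0], [1.0, 3.0]], [2.0, 4.0, 6.0]))
```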
