MATLAB partial least squares function plsregress

function [Xloadings,Yloadings,Xscores,Yscores, ...
    beta,pctVar,mse,stats] = plsregress(X,Y,ncomp,varargin)

plsregress computes a partial least squares regression of Y on X using ncomp PLS components (latent variables) and returns the predictor and response loadings.

X is an n-by-p matrix of predictor variables; rows correspond to observations and columns to variables.

Y is an n-by-m matrix of response variables.
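
As a quick illustration, here is a minimal sketch of a call on synthetic data (the data, dimensions, and variable names below are made up for the example, and plsregress requires the Statistics and Machine Learning Toolbox):

% Synthetic example: n = 100 observations, p = 5 predictors, m = 1 response
rng(0);                                  % for reproducibility
n = 100; p = 5;
X = randn(n, p);                         % n-by-p predictor matrix
y = X*[1; -2; 0.5; 0; 3] + 0.1*randn(n, 1);   % n-by-1 response
ncomp = 3;                               % number of PLS components to extract
[Xloadings, Yloadings, Xscores, Yscores, beta, pctVar, mse, stats] = ...
    plsregress(X, y, ncomp);
size(Xloadings)                          % p-by-ncomp
size(Xscores)                            % n-by-ncomp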

Xloadings is a p-by-ncomp matrix of predictor loadings; each row contains the coefficients of a linear combination of PLS components that approximates the corresponding original predictor variable.

Xscores contains the predictor scores, that is, the PLS components themselves: linear combinations of the variables in X. It is an n-by-ncomp matrix whose rows correspond to observations and whose columns correspond to components; the columns of Xscores are mutually orthogonal.
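
Both properties can be checked numerically; this sketch assumes the variables from the synthetic example above are still in the workspace:

% The columns of Xscores are mutually orthogonal, so the off-diagonal
% entries of Xscores'*Xscores are numerically zero.
G = Xscores'*Xscores;
max(max(abs(G - diag(diag(G)))))         % close to 0

% Xscores*Xloadings' approximates the centered predictor matrix X0
% (plsregress centers X internally); the fit improves as ncomp grows.
X0 = X - mean(X, 1);
norm(X0 - Xscores*Xloadings', 'fro')     % reconstruction error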

The loading matrix holds the coefficients with which each original variable is expressed in terms of the extracted factors; it describes how strongly each extracted common factor influences each original variable.

The score matrix describes the relationship between the indicator variables and the extracted common factors; a larger score on a common factor means that the indicator is more closely related to that factor.

Put simply, the loading matrix lets you write each original indicator variable as a linear combination of the factors.

For example, X1 = a11*F1 + a12*F2 + a13*F3, where X1 is an indicator variable, a11, a12, a13 are the loadings in the row of the loading matrix corresponding to X1, and F1, F2, F3 are the extracted common factors.

Conversely, the score (weight) matrix lets you write each common factor as a linear combination of the original variables, for example F1 = w11*X1 + w21*X2 + w31*X3, where the w's are the entries of the score matrix in the column corresponding to F1.
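
In terms of the plsregress outputs, these two directions correspond to Xloadings and to the weight matrix stats.W described further below; a sketch, again continuing the synthetic example:

% Each centered predictor is approximately a linear combination of the
% components, with coefficients from the corresponding row of Xloadings:
%   X0(:,1) is approximately Xloadings(1,1)*F1 + Xloadings(1,2)*F2 + ...
x1_approx = Xscores*Xloadings(1, :)';    % reconstruct the first predictor

% Each component is a linear combination of the centered predictors,
% with weights from the corresponding column of stats.W:
%   F1 = W(1,1)*X0(:,1) + W(2,1)*X0(:,2) + ...
F1_check = X0*stats.W(:, 1);
max(abs(F1_check - Xscores(:, 1)))       % close to 0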

beta contains the coefficients of the fitted regression model; it is a (p+1)-by-m matrix whose first row holds the intercept terms.
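
Because the first row of beta holds the intercepts, fitted values are obtained by prepending a column of ones to X; a short sketch, using the same example as above:

yfit = [ones(n, 1), X]*beta;             % fitted responses
res  = y - yfit;                         % residuals
Rsq  = 1 - sum(res.^2)/sum((y - mean(y)).^2)   % goodness of fit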

pctVar is a 2-by-ncomp matrix: the first row gives the variance in the predictors (X) explained by each extracted component, and the second row the variance in the responses (Y) explained by each component, each as a fraction of the total.

mse is a 2-by-(ncomp+1) matrix of estimated mean squared errors: the j-th element of the first row is the mean squared error for the predictor variables when the model uses the first j-1 components, and the j-th element of the second row is the corresponding mean squared error for the response variables.
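
pctVar and mse are commonly used together to choose the number of components; the 'cv' name-value pair of plsregress requests k-fold cross-validation (10 folds assumed in this sketch, continuing the synthetic example):

% Cumulative percent of response variance explained per component
% (pctVar is returned as fractions, so multiply by 100)
cumsum(100*pctVar(2, :))

% Cross-validated MSE over a range of component counts
maxComp = 5;
[~, ~, ~, ~, ~, ~, mseCV] = plsregress(X, y, maxComp, 'cv', 10);
plot(0:maxComp, mseCV(2, :), '-o')       % second row: MSE for the response
xlabel('Number of PLS components')
ylabel('Cross-validated MSE of Y')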

stats is a structure with four fields. stats.W is a p-by-ncomp matrix of PLS weights such that Xscores = X0*stats.W, where X0 is the centered predictor matrix; each column contains the weights that define one PLS component.

stats.T2 contains the Hotelling's T-squared statistic for each point (row) of Xscores.

stats.Xresiduals contains the predictor residuals, X0 - Xscores*Xloadings'.

stats.Yresiduals contains the response residuals, Y0 - Xscores*Yloadings', where Y0 is the centered response matrix.
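
These definitions can be verified directly (same synthetic example as above; Y0 denotes the centered response):

max(max(abs(Xscores - X0*stats.W)))      % Xscores = X0*W, close to 0
numel(stats.T2)                          % one T-squared value per observation
Y0 = y - mean(y);
max(max(abs(stats.Xresiduals - (X0 - Xscores*Xloadings'))))   % close to 0
max(max(abs(stats.Yresiduals - (Y0 - Xscores*Yloadings'))))   % close to 0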
