PCA Algorithms and examples

PCA algorithm

Algorithm steps:
Suppose we have m pieces of n-dimensional data.
1. Arrange the original data by columns into an n-row, m-column matrix X (one sample per column).
2. Zero-center each row of X (each row represents one attribute) by subtracting that row's mean.
3. Compute the covariance matrix C = (1/m) X Xᵀ.
4. Compute the eigenvalues of the covariance matrix and their corresponding eigenvectors.
5. Arrange the eigenvectors as rows of a matrix, ordered from top to bottom by decreasing eigenvalue, and take the first k rows to form the matrix P.
6. Y = P X is the data after dimensionality reduction to k dimensions.
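The steps above can be sketched in a few lines of NumPy (a minimal illustration under the one-sample-per-column convention used in this article; the function name and the sample data are my own, not from the original):

```python
import numpy as np

def pca(X, k):
    """Reduce an n x m data matrix X (one sample per COLUMN) to k dimensions."""
    # Step 2: zero-center each row (each row is one attribute)
    Xc = X - X.mean(axis=1, keepdims=True)
    m = Xc.shape[1]
    # Step 3: covariance matrix C = (1/m) X X^T
    C = Xc @ Xc.T / m
    # Step 4: eigenvalues and eigenvectors (eigh, since C is symmetric)
    eigvals, eigvecs = np.linalg.eigh(C)
    # Step 5: sort eigenvectors by decreasing eigenvalue, take the first k as rows of P
    order = np.argsort(eigvals)[::-1][:k]
    P = eigvecs[:, order].T
    # Step 6: Y = P X is the k-dimensional representation
    return P @ Xc

# Example: five 2-D points reduced to 1-D
X = np.array([[-1., -1., 0., 2., 0.],
              [-2.,  0., 0., 1., 1.]])
Y = pca(X, 1)
```

Note that the signs of the eigenvectors (and hence of the projected coordinates) are arbitrary, so two implementations may return results that differ by a factor of -1.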

Example


Taking the following data matrix as an example (five two-dimensional points, one per column; this is the data set used in the referenced tutorial):

    X = | -1  -1   0   2   0 |
        | -2   0   0   1   1 |

we use the PCA method to reduce this two-dimensional data to one dimension. Since each row of this matrix already has zero mean, we can compute the covariance matrix directly:

    C = (1/5) X Xᵀ = | 6/5  4/5 |
                     | 4/5  6/5 |

Then the eigenvalues and eigenvectors are calculated. The eigenvalues are:

    λ1 = 2,  λ2 = 2/5
The corresponding eigenvectors are:

    c1 (1, 1)ᵀ for λ1,    c2 (-1, 1)ᵀ for λ2

Since an eigenvector is a general solution, c1 and c2 can be arbitrary nonzero real numbers. The normalized eigenvectors are:

    (1/√2, 1/√2)ᵀ  and  (-1/√2, 1/√2)ᵀ

So our matrix P is:

    P = |  1/√2  1/√2 |
        | -1/√2  1/√2 |

You can verify that P diagonalizes the covariance matrix C:

    P C Pᵀ = | 2   0   |
             | 0   2/5 |

Finally, multiplying the data matrix by the first row of P gives the data representation after reduction to one dimension:

    Y = ( 1/√2  1/√2 ) X = ( -3/√2  -1/√2   0   3/√2   1/√2 )

The projected points after dimensionality reduction lie on the line y = x in the original plane, the direction of the first eigenvector (the figure from the original article is not reproduced here).

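As a numerical sanity check, the worked example can be reproduced with NumPy (a short sketch assuming the data set from the referenced tutorial; variable names are my own, and eigenvector signs may differ from the hand computation since they are arbitrary):

```python
import numpy as np

# Five two-dimensional points, one per column (each row already has zero mean)
X = np.array([[-1., -1., 0., 2., 0.],
              [-2.,  0., 0., 1., 1.]])

# Covariance matrix C = (1/5) X X^T, i.e. [[6/5, 4/5], [4/5, 6/5]]
C = X @ X.T / X.shape[1]

# Eigenvalues of C are 2 and 2/5 (eigh returns them in ascending order)
eigvals, eigvecs = np.linalg.eigh(C)

# P: normalized eigenvectors as rows, largest eigenvalue first
P = eigvecs[:, ::-1].T

# P diagonalizes C: P C P^T = diag(2, 2/5)
D = P @ C @ P.T

# Project onto the first principal direction to get the 1-D representation
Y = P[:1] @ X
```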
PCA essentially takes the directions of greatest variance as the principal features and "de-correlates" the data across orthogonal directions, i.e., it makes the components uncorrelated along the different orthogonal directions.
Therefore, PCA also has some limitations. For example, it removes linear correlation very well, but it cannot handle higher-order dependence; for data with higher-order dependence, kernel PCA can be considered, which converts nonlinear dependence into linear correlation through a kernel function. In addition, PCA assumes that the principal features of the data lie along orthogonal directions; if several high-variance directions are non-orthogonal, the effectiveness of PCA is greatly reduced.
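To illustrate the kernel idea, here is a minimal kernel PCA sketch in plain NumPy (my own illustrative code, not from the original article; the RBF kernel choice and the gamma parameter are assumptions):

```python
import numpy as np

def rbf_kernel_pca(X, k, gamma=1.0):
    """Kernel PCA with an RBF kernel.

    X: m x n array, one sample per ROW (note: transposed relative to the
       linear PCA convention above). Returns the m x k projected data.
    """
    m = X.shape[0]
    # Pairwise squared distances and the RBF kernel matrix
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    K = np.exp(-gamma * sq)
    # Double-center the kernel matrix (centering in feature space)
    one = np.ones((m, m)) / m
    Kc = K - one @ K - K @ one + one @ K @ one
    # Eigendecompose the centered kernel matrix, largest eigenvalues first
    eigvals, eigvecs = np.linalg.eigh(Kc)
    eigvals, eigvecs = eigvals[::-1], eigvecs[:, ::-1]
    # Normalize eigenvectors so the feature-space axes have unit length
    alphas = eigvecs[:, :k] / np.sqrt(np.maximum(eigvals[:k], 1e-12))
    # Project the training points onto the top-k kernel components
    return Kc @ alphas

# Usage (hypothetical data): project 3-D points onto 2 kernel components
rng = np.random.default_rng(0)
Z = rbf_kernel_pca(rng.normal(size=(20, 3)), k=2, gamma=0.5)
```

The eigendecomposition here is of the m x m centered kernel matrix rather than the n x n covariance matrix, which is why the sample-per-row convention is more natural for this sketch.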
PCA is a non-parametric technique: given the same data, anyone following the steps will obtain the same result, with no subjective parameters involved. This makes PCA easy to implement, but it also means the result cannot be tuned with prior knowledge about the data.

Main reference for this article: http://blog.codinglabs.org/articles/pca-tutorial.html

