Stanford ML Public Lesson Note 15: Latent Semantic Indexing, Singular Value Decomposition, Independent Component Analysis
In the previous note we discussed PCA (principal component analysis).
PCA is a direct dimensionality-reduction method: by computing the eigenvalues and eigenvectors of the data's covariance matrix and keeping the eigenvectors with the largest eigenvalues, it projects the data into a lower-dimensional space.
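To make the eigendecomposition route above concrete, here is a minimal NumPy sketch (the data matrix is made up for illustration and is not from the lecture):

```python
import numpy as np

# Toy data: 5 samples, 3 features (hypothetical example).
X = np.array([[2.5, 2.4, 0.5],
              [0.5, 0.7, 1.9],
              [2.2, 2.9, 0.4],
              [1.9, 2.2, 0.8],
              [3.1, 3.0, 0.3]])

# Center the data, then eigendecompose its covariance matrix.
Xc = X - X.mean(axis=0)
cov = np.cov(Xc, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)  # eigh returns ascending eigenvalues

# Keep the k eigenvectors with the largest eigenvalues.
k = 2
top = eigvecs[:, np.argsort(eigvals)[::-1][:k]]

# Project onto those directions: 5x3 -> 5x2.
X_reduced = Xc @ top
print(X_reduced.shape)  # (5, 2)
```

The projection matrix `top` has orthonormal columns, so the reduced coordinates preserve as much variance as any k-dimensional linear projection can.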
This note continues the topic of PCA, covering an application of PCA, LSI (latent semantic indexing), and an implementation of PCA, SVD (singular value decomposition).
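Since SVD is introduced here as a way to implement PCA, the connection can be sketched as follows (a minimal NumPy example with synthetic random data, not from the lecture): the right singular vectors of the centered data matrix are the principal directions, and the squared singular values divided by n - 1 equal the covariance eigenvalues.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))   # hypothetical data matrix
Xc = X - X.mean(axis=0)

# SVD of the centered data: Xc = U @ diag(S) @ Vt.
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)

# Rows of Vt are the principal directions; projecting onto the
# top k of them gives the same reduction as the eigenvector route.
k = 2
X_reduced = Xc @ Vt[:k].T

# Check: covariance eigenvalues equal S**2 / (n - 1).
eigvals = np.linalg.eigvalsh(np.cov(Xc, rowvar=False))[::-1]
print(np.allclose(eigvals[:k], S[:k] ** 2 / (len(X) - 1)))  # True
```

Working from the data matrix directly via SVD avoids forming the covariance matrix explicitly, which is numerically more stable in practice.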
Once SVD and LSI are covered, the PCA material is complete.
The second half of the video then turns to another unsupervised learning method, ICA (independent component analysis).
A PDF of notes 13-15 has been uploaded to the CSDN resources section and is available for download.