Stanford ML Open Course Notes 15: Latent Semantic Indexing, Singular Value Decomposition, Independent Component Analysis
In the previous note, we discussed PCA, a direct dimensionality-reduction method: it computes the eigenvalues and eigenvectors of the data's covariance matrix and keeps the eigenvectors with the largest eigenvalues to achieve dimensionality reduction.
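The eigendecomposition procedure described above can be sketched in a few lines of NumPy. This is a minimal illustration on hypothetical toy data, not code from the course:

```python
import numpy as np

# Hypothetical toy data: 100 samples, 3 features, with very different variances
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3)) * np.array([3.0, 1.0, 0.1])

# Center the data and form the sample covariance matrix
Xc = X - X.mean(axis=0)
cov = Xc.T @ Xc / (len(Xc) - 1)

# Eigendecomposition of the covariance (eigh returns ascending eigenvalues)
eigvals, eigvecs = np.linalg.eigh(cov)

# Keep the k eigenvectors with the largest eigenvalues
k = 2
top = eigvecs[:, np.argsort(eigvals)[::-1][:k]]

# Project the centered data onto the top-k principal directions
X_reduced = Xc @ top
print(X_reduced.shape)  # (100, 2)
```

The projection keeps the directions along which the data varies most, which is exactly the "select eigenvectors with larger eigenvalues" step from the previous note.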
This note continues the PCA topic with one application of PCA, LSI (Latent Semantic Indexing), and one implementation of PCA, SVD (Singular Value Decomposition). Once SVD and LSI are covered, the PCA material comes to an end. The second half of the video begins a new unsupervised learning topic: ICA (Independent Component Analysis).
The PDFs for Parts 13-15 of these notes have been uploaded to the CSDN resources; click Tu longbao to download.