Advantages and disadvantages of various algorithms:

Source: Internet
Author: User

1. SRC

1) SRC is relatively robust to noise in the test set, but when the training set also contains noise its performance is often poor, because SRC needs a clean training set in order to learn a good dictionary.

Exact words from Bao, B.-K., Liu, G., Hong, R., Yan, S. and Xu, C., "General subspace learning with corrupted training data via graph embedding," IEEE Trans. Image Process., 22, 4380-4393: "Sparse Representation Classifier (SRC) [24] is a robust supervised method and can correct the corruptions possibly existing in testing data, but cannot well handle the cases where the training data themselves are corrupted [25]."
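As a concrete illustration (not from the original post), here is a minimal SRC sketch: the sparse code is computed with plain ISTA, and the sample is assigned to the class whose atoms give the smallest reconstruction residual. The toy dictionary, λ, and iteration count are all illustrative choices.

```python
import numpy as np

def ista_lasso(D, y, lam=0.1, n_iter=500):
    """Solve min_x 0.5*||y - D x||^2 + lam*||x||_1 via ISTA."""
    L = np.linalg.norm(D, 2) ** 2          # Lipschitz constant of the gradient
    x = np.zeros(D.shape[1])
    for _ in range(n_iter):
        g = x - (D.T @ (D @ x - y)) / L    # gradient step
        x = np.sign(g) * np.maximum(np.abs(g) - lam / L, 0.0)  # soft-threshold
    return x

def src_classify(D, labels, y, lam=0.1):
    """SRC: pick the class whose atoms reconstruct y with smallest residual."""
    x = ista_lasso(D, y, lam)
    residuals = {}
    for c in np.unique(labels):
        xc = np.where(labels == c, x, 0.0)  # keep only class-c coefficients
        residuals[c] = np.linalg.norm(y - D @ xc)
    return min(residuals, key=residuals.get)

# Toy dictionary: two classes living in disjoint coordinate subspaces.
rng = np.random.default_rng(0)
D0 = rng.normal(size=(20, 5)); D0[10:] = 0   # class-0 atoms: first 10 coords
D1 = rng.normal(size=(20, 5)); D1[:10] = 0   # class-1 atoms: last 10 coords
D = np.hstack([D0, D1])
D /= np.linalg.norm(D, axis=0)               # unit-norm atoms
labels = np.array([0] * 5 + [1] * 5)

y = D0 @ rng.normal(size=5)                  # a clean class-0 sample
y /= np.linalg.norm(y)
print(src_classify(D, labels, y))            # → 0
```

Because the two classes' atoms have disjoint supports here, the lasso puts no weight on class-1 atoms at all; with a noisy *training* dictionary the residual comparison degrades, which is exactly the weakness noted above.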

2. PCA

PCA is the most widely used dimensionality-reduction and error-correction method. In practice, however, when the data contain gross corruptions (contamination, occlusion), PCA cannot capture the true subspace structure of the data well and performs poorly, and the larger the occlusion, the worse the result.
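A minimal sketch of this point, assuming PCA is computed by SVD of the centered data: on clean low-rank data the top-k reconstruction is exact, while a single gross corruption can swing the estimated subspace, because the squared-error objective is dominated by the outlier.

```python
import numpy as np

# PCA via SVD: project onto the top-k principal directions, then reconstruct.
rng = np.random.default_rng(1)
basis = rng.normal(size=(50, 3))                 # true 3-dim subspace
X = basis @ rng.normal(size=(3, 200))            # 50-dim data, rank 3

Xc = X - X.mean(axis=1, keepdims=True)           # center the data
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
k = 3
Uk = U[:, :k]                                    # top-k principal directions
X_hat = Uk @ (Uk.T @ Xc)                         # rank-k reconstruction
print(np.allclose(X_hat, Xc, atol=1e-8))         # True: clean data recovered

# Gross corruption: occlude one entry heavily; the estimated subspace shifts.
X_bad = Xc.copy()
X_bad[0, 0] += 1e4
Ub = np.linalg.svd(X_bad, full_matrices=False)[0][:, :k]
print(np.linalg.norm(Uk @ Uk.T - Ub @ Ub.T))     # large: subspaces differ
```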

3. RPCA

RPCA was proposed to address PCA's weakness to occlusion and other gross corruption.

1) RPCA handles sparse noise very well, but it is an unsupervised method and cannot exploit label information to improve the recognition rate.

2) RPCA cannot process new samples directly; even if new samples are to be handled, every new sample requires recomputing over all training samples, which is time-consuming.

One way to process a new sample with RPCA is to take the low-rank matrix A = UΣVᵀ recovered from the training samples and use the projection matrix U to project the new sample. The drawback is that this projection does not fit the original training sample matrix X well: the principal component is UUᵀX, the error is E = X − UUᵀX, and this E is no longer sparse.

3) Advantage of the nuclear-norm constraint: it can discover the low-rank structure shared among the data classes.
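For concreteness, here is a minimal sketch of one standard RPCA solver, the inexact augmented Lagrange multiplier (IALM) method of Lin et al.; the penalty schedule, default λ = 1/√max(m, n), and toy data are illustrative choices, not part of the original post.

```python
import numpy as np

def shrink(M, tau):
    """Entrywise soft-thresholding (proximal operator of the l1 norm)."""
    return np.sign(M) * np.maximum(np.abs(M) - tau, 0.0)

def rpca_ialm(X, lam=None, n_iter=100):
    """Inexact-ALM sketch for RPCA:
    min ||A||_* + lam*||S||_1  subject to  X = A + S."""
    m, n = X.shape
    if lam is None:
        lam = 1.0 / np.sqrt(max(m, n))       # common default weight
    mu = 1.25 / np.linalg.norm(X, 2)         # initial penalty (heuristic)
    mu_max, rho = mu * 1e7, 1.6
    Y = np.zeros_like(X)                     # Lagrange multiplier
    S = np.zeros_like(X)
    for _ in range(n_iter):
        # A-step: singular value thresholding
        U, s, Vt = np.linalg.svd(X - S + Y / mu, full_matrices=False)
        A = U @ np.diag(shrink(s, 1.0 / mu)) @ Vt
        # S-step: entrywise soft-thresholding
        S = shrink(X - A + Y / mu, lam / mu)
        Y += mu * (X - A - S)                # dual ascent
        mu = min(mu * rho, mu_max)
    return A, S

# Toy check: recover a rank-5 matrix from 5% gross corruptions.
rng = np.random.default_rng(3)
A0 = rng.normal(size=(40, 5)) @ rng.normal(size=(5, 40))
S0 = np.zeros_like(A0)
mask = rng.random(A0.shape) < 0.05
S0[mask] = rng.normal(scale=20.0, size=mask.sum())
A, S = rpca_ialm(A0 + S0)
print(np.linalg.norm(A - A0) / np.linalg.norm(A0))   # small: near-exact recovery
```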
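The non-sparse residual mentioned in the note on new samples can be checked numerically. Assuming RPCA recovered the low-rank part A exactly and U spans its column space, the residual of the original corrupted matrix is E = X − UUᵀX = (I − UUᵀ)S, which smears each gross error over its whole column. A toy sketch:

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.normal(size=(30, 4)) @ rng.normal(size=(4, 100))   # low-rank part
S = np.zeros_like(A)
idx = rng.random(A.shape) < 0.05
S[idx] = rng.normal(scale=10.0, size=idx.sum())            # sparse gross errors
X = A + S

# Suppose RPCA recovered A exactly; take its column space U.
U = np.linalg.svd(A, full_matrices=False)[0][:, :4]

E = X - U @ (U.T @ X)     # residual of the ORIGINAL (corrupted) matrix
# E = (I - UU^T)S: each sparse error is spread across all 30 rows of its
# column, so E has far more non-negligible entries than S did.
print((np.abs(S) > 1e-6).mean())   # ~0.05: S is sparse
print((np.abs(E) > 1e-6).mean())   # much larger: E is dense
```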

4. Graph-embedding methods: PCA, LDA, NMF, MFA, MNGE, PNGE, etc.

1) When the training set contains occlusions, these methods perform poorly.

5. Linear regression methods

1) The class-label matrix is apparently not centered. However, Cai, X., Ding, C., Nie, F. and Huang, H., "On the equivalent of low-rank linear regressions and linear discriminant analysis based regressions," in Proceedings of the ACM SIGKDD Int. Conf. on Knowledge Discovery and Data Mining, 1124-1132, point out that if the training sample matrix X is centered, then it makes no difference whether the class-label matrix Y is centered or not.
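A quick numerical check of this centering claim (a sketch, not taken from the cited paper): with X centered, the least-squares solution is identical whether or not Y is centered, because XᵀY and Xᵀ(Y − 1ȳᵀ) coincide when Xᵀ1 = 0.

```python
import numpy as np

rng = np.random.default_rng(4)
X = rng.normal(size=(100, 10))
X = X - X.mean(axis=0)                  # center the samples (rows = samples)
Y = rng.integers(0, 2, size=(100, 3)).astype(float)   # uncentered label matrix

W1 = np.linalg.lstsq(X, Y, rcond=None)[0]                   # raw Y
W2 = np.linalg.lstsq(X, Y - Y.mean(axis=0), rcond=None)[0]  # centered Y
print(np.allclose(W1, W2))  # True: once X is centered, centering Y changes nothing
```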

6. LDA

(1) The ratio-of-trace versus trace-of-ratio issue: generally, the trace-of-ratio formulation is the one equivalent to linear regression methods.

(2) In the full-rank case, multivariate linear regression is equivalent to LDA (for min ||Y − XAB||, when B is full rank).

Ye proved that the reduced dimensionality equals k − 1 under the condition rank(Sb) + rank(Sw) = rank(St).
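This rank condition can be checked numerically. For generic undersampled data (d ≫ n) one gets rank(Sb) = k − 1, rank(Sw) = n − k, and rank(St) = n − 1, so the condition holds; with many samples and a full-rank Sw it generally does not. A sketch (toy sizes are illustrative):

```python
import numpy as np

rng = np.random.default_rng(5)
k, per_class, d = 3, 4, 50                 # undersampled: d >> n = 12
X = rng.normal(size=(k * per_class, d))
y = np.repeat(np.arange(k), per_class)

mu = X.mean(axis=0)
Sb = np.zeros((d, d)); Sw = np.zeros((d, d))
for c in range(k):
    Xc = X[y == c]
    mc = Xc.mean(axis=0)
    Sb += len(Xc) * np.outer(mc - mu, mc - mu)   # between-class scatter
    Sw += (Xc - mc).T @ (Xc - mc)                # within-class scatter
St = (X - mu).T @ (X - mu)                       # total scatter (= Sb + Sw)

r = lambda M: np.linalg.matrix_rank(M)
print(r(Sb), r(Sw), r(St))      # 2 9 11, i.e. (k-1, n-k, n-1)
print(r(Sb) + r(Sw) == r(St))   # True for this generic undersampled data
```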
