Last year I began working with low-rank representations. Recently, while reading some papers, I realized I was still not very familiar with the topic, so I am recording here what I have learned about low-rank representation.
Currently, low-rank representation is mainly used for subspace segmentation: given a set of data drawn from several subspaces, a low-rank representation can be used to cluster the data and assign each sample to the specific subspace it came from.
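As a small illustration of the "union of subspaces" setting (my own sketch, not from any particular paper): if we draw samples from two random 2-dimensional subspaces, each block of samples has rank 2, and the stacked data matrix has rank 4, i.e. the subspace dimensions add up. All sizes here are arbitrary choices for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two 2-dimensional subspaces of R^5, each spanned by a random basis.
B1 = rng.standard_normal((5, 2))
B2 = rng.standard_normal((5, 2))

# 30 samples from each subspace: basis times random coefficients.
X1 = B1 @ rng.standard_normal((2, 30))
X2 = B2 @ rng.standard_normal((2, 30))
X = np.hstack([X1, X2])   # 5 x 60 matrix whose columns are the samples

# Each block is rank 2; their union is rank 4 (generically, the dims add).
print(np.linalg.matrix_rank(X1), np.linalg.matrix_rank(X))
```

Subspace clustering is the inverse problem: given only the mixed columns of `X`, recover which columns belong together.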
There are many methods for separating subspaces. The first family is based on probabilistic models: since a Gaussian distribution can represent a subspace, methods of this kind generally assume the data follow a mixture of Gaussians.
The second is the factorization-based family, which usually modifies an existing matrix factorization method and refines the result over multiple iterations.
The third is sparse subspace clustering (SSC), which imposes a sparsity constraint on the representation coefficient matrix (the overall sparse matrix is obtained by constraining each column to be sparse).
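To make the column-wise sparsity constraint concrete, here is a minimal sketch of the SSC self-representation step, solving a lasso problem per column with plain ISTA (my own toy implementation; the function name, the regularization weight `lam`, and all data sizes are assumptions for the demo, and real SSC solvers use more careful optimization). Each sample is written as a sparse combination of the other samples, so the coefficient mass should concentrate on samples from the same subspace.

```python
import numpy as np

def soft_threshold(v, t):
    """Elementwise soft-thresholding, the proximal operator of the l1 norm."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ssc_coefficients(X, lam=0.05, iters=1000):
    """Columnwise lasso via ISTA: represent each sample sparsely
    using the *other* samples (self-representation)."""
    n = X.shape[1]
    C = np.zeros((n, n))
    L = np.linalg.norm(X, 2) ** 2        # Lipschitz constant of the gradient
    for j in range(n):
        D = X.copy()
        D[:, j] = 0.0                    # a sample may not use itself
        c = np.zeros(n)
        for _ in range(iters):
            grad = D.T @ (D @ c - X[:, j])
            c = soft_threshold(c - grad / L, lam / L)
        C[:, j] = c
    return C

rng = np.random.default_rng(0)
# 15 unit-norm samples from each of two 2-D subspaces of R^8.
B1, B2 = rng.standard_normal((8, 2)), rng.standard_normal((8, 2))
X = np.hstack([B1 @ rng.standard_normal((2, 15)),
               B2 @ rng.standard_normal((2, 15))])
X /= np.linalg.norm(X, axis=0)

C = ssc_coefficients(X)
# Coefficient mass inside each subspace's block vs. across blocks.
within = np.abs(C[:15, :15]).sum() + np.abs(C[15:, 15:]).sum()
across = np.abs(C[:15, 15:]).sum() + np.abs(C[15:, :15]).sum()
print(within > across)
```

In full SSC, `|C| + |C|.T` is then used as an affinity matrix for spectral clustering.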
The drawback of the three families above is that they are sensitive to noise and outliers: once noise is present, the decomposition is no longer accurate.
Low-rank representation (LRR) was therefore proposed. Because the low-rank constraint acts on the coefficient matrix as a whole, LRR represents the data from a global perspective. Moreover, since noise tends to increase the rank of the data, it is naturally suppressed under the low-rank constraint, which makes the method quite robust to noise.
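For clean data, the noiseless LRR problem min ||Z||_* s.t. X = XZ has a known closed-form solution: Z = V_r V_r^T, where V_r comes from the skinny SVD of X (this is the classical shape interaction matrix). A minimal numpy sketch, with data sizes chosen arbitrarily for the demo:

```python
import numpy as np

rng = np.random.default_rng(1)

# 20 samples from each of two independent 2-D subspaces of R^10.
B1, B2 = rng.standard_normal((10, 2)), rng.standard_normal((10, 2))
X = np.hstack([B1 @ rng.standard_normal((2, 20)),
               B2 @ rng.standard_normal((2, 20))])

# Skinny SVD; keep singular values above a numerical tolerance.
U, s, Vt = np.linalg.svd(X, full_matrices=False)
r = int((s > 1e-10 * s[0]).sum())
Z = Vt[:r].T @ Vt[:r]     # closed-form minimizer of ||Z||_* s.t. X = XZ

print(np.allclose(X @ Z, X))   # X is exactly self-represented
```

For independent subspaces this Z is block-diagonal, so its entries directly reveal which samples share a subspace; the noisy LRR formulation adds an error term and is solved iteratively instead.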