NOTES: Tensor RPCA: Exact Recovery of Corrupted Low-Rank Tensors via Convex Optimization


Lu, C., et al. Tensor Robust Principal Component Analysis: Exact Recovery of Corrupted Low-Rank Tensors via Convex Optimization. In IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2016, pp. 5249–5257.
This article is a note on the above CVPR paper, mainly giving a detailed explanation of its theoretical method. My academic level is limited; if there are any mistakes in the text, please correct me.

Abstract: This paper studies the Tensor Robust Principal Component Analysis (TRPCA) problem, which extends the matrix RPCA setting to tensors. The model is based on a new tensor singular value decomposition (t-SVD) and the tubal rank and tensor nuclear norm derived from it. Consider a 3-D tensor $\mathcal{X} \in \mathbb{R}^{n_1 \times n_2 \times n_3}$ satisfying $\mathcal{X} = \mathcal{L}_0 + \mathcal{S}_0$, where $\mathcal{L}_0$ is a low-rank part and $\mathcal{S}_0$ is a sparse part. At first sight it seems impossible to recover both parts simultaneously. The authors prove that, under suitable assumptions, the low-rank and sparse parts can be recovered exactly by solving a convex optimization problem whose objective is a weighted sum of a tensor nuclear norm and an $\ell_1$ norm, i.e.
\begin{equation} \min_{\mathcal{L}, \mathcal{S}} \ \|\mathcal{L}\|_* + \lambda \|\mathcal{S}\|_1, \quad \text{s.t.} \quad \mathcal{X} = \mathcal{L} + \mathcal{S}, \end{equation}
where $\lambda = 1/\sqrt{\max(n_1, n_2)\, n_3}$. Moreover, when $n_3 = 1$, TRPCA reduces to the two-dimensional RPCA model, so it is a simple and elegant extension of matrix RPCA.
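The recommended weight $\lambda$ and the tubal rank can be computed directly. A minimal sketch, assuming the usual definition of the t-SVD via the FFT along the third mode (function names here are illustrative, not from the paper's code):

```python
import numpy as np

def trpca_lambda(n1, n2, n3):
    """Regularization weight suggested by the paper:
    lambda = 1 / sqrt(max(n1, n2) * n3)."""
    return 1.0 / np.sqrt(max(n1, n2) * n3)

def tubal_rank(T, tol=1e-10):
    """Tubal rank of a 3-D tensor: apply the FFT along the third mode
    and take the maximum matrix rank over the frontal slices."""
    Tf = np.fft.fft(T, axis=2)
    return max(np.linalg.matrix_rank(Tf[:, :, k], tol=tol)
               for k in range(T.shape[2]))
```

Note that for $n_3 = 1$ the weight reduces to the matrix-RPCA choice $1/\sqrt{\max(n_1, n_2)}$, consistent with the reduction of TRPCA to RPCA.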

Introduction

The problem of exploring low-dimensional structure in high-dimensional data has become increasingly important in image, text, and video processing, as well as in web search, where the data live in very high-dimensional spaces. Classical PCA [1] is the most widely used statistical tool for data analysis and dimensionality reduction. It is computationally efficient and works well when the data are corrupted only by small noise. However, the biggest problem with PCA is that it is fragile to grossly corrupted observations and outliers, which are ubiquitous in real data. Although many improved versions of PCA have been proposed, they all suffer from high computational cost.

The recently proposed Robust PCA [2] is the first polynomial-time algorithm for this problem with strong performance guarantees. Given a data matrix $\mathbf{X} \in \mathbb{R}^{n_1 \times n_2}$, it assumes that $\mathbf{X}$ can be decomposed as $\mathbf{X} = \mathbf{L}_0 + \mathbf{S}_0$, where $\mathbf{L}_0$ is low-rank and $\mathbf{S}_0$ is sparse.
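Convex programs of this form are commonly solved with an alternating-direction method of multipliers (ADMM), alternating singular value thresholding for the low-rank part with soft thresholding for the sparse part. A minimal sketch for the matrix case, assuming the standard inexact-ALM heuristics (the `mu` initialization and growth factor `rho` are conventional choices, not specified in this paper):

```python
import numpy as np

def svt(M, tau):
    # Singular value thresholding: prox of tau * nuclear norm.
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def shrink(M, tau):
    # Soft thresholding: prox of tau * l1 norm.
    return np.sign(M) * np.maximum(np.abs(M) - tau, 0.0)

def rpca_admm(X, lam=None, n_iter=500, tol=1e-7):
    """Solve min ||L||_* + lam ||S||_1  s.t.  X = L + S  by ADMM."""
    n1, n2 = X.shape
    if lam is None:
        lam = 1.0 / np.sqrt(max(n1, n2))  # weight from the RPCA theory
    mu = 1.25 / np.linalg.norm(X, 2)      # common inexact-ALM initialization
    rho, mu_max = 1.5, 1e10               # penalty growth factor and cap
    L = np.zeros_like(X)
    S = np.zeros_like(X)
    Y = np.zeros_like(X)                  # Lagrange multiplier
    normX = np.linalg.norm(X)
    for _ in range(n_iter):
        L = svt(X - S + Y / mu, 1.0 / mu)
        S = shrink(X - L + Y / mu, lam / mu)
        R = X - L - S                     # constraint residual
        Y = Y + mu * R
        mu = min(mu * rho, mu_max)
        if np.linalg.norm(R) / (normX + 1e-12) < tol:
            break
    return L, S
```

On synthetic data with a low-rank component plus sparse gross corruptions, this sketch typically recovers both parts to high accuracy, which is the behavior the RPCA theory predicts.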
