The difference between compressed sensing and deep learning
Source: Internet
Author: User
These are essentially two different problems. If you must find a connection, both involve sparse representations of data.
Compressed sensing solves a linear inverse problem Ax = b. For a given underdetermined linear system, if the solution is known to be sparse, sparsity can be used as a constraint or a regularization term to supply additional prior information. Linear inverse problems with sparsity priors have a relatively complete theoretical framework, and the book by Michael Elad recommended in Yang Liu's answer above is a very good introduction.
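As a minimal sketch of this idea (not from the original answer; the dimensions, regularization weight, and iteration count below are illustrative), one can recover a sparse x from underdetermined measurements b = Ax by solving the LASSO problem min_x 0.5·||Ax − b||² + λ·||x||₁ with ISTA (iterative soft-thresholding):

```python
import numpy as np

rng = np.random.default_rng(0)
n, m, k = 50, 200, 5                  # 50 measurements, 200 unknowns, 5 nonzeros
A = rng.standard_normal((n, m)) / np.sqrt(n)
x_true = np.zeros(m)
x_true[rng.choice(m, k, replace=False)] = rng.standard_normal(k)
b = A @ x_true                        # underdetermined measurements

def ista(A, b, lam=0.01, iters=2000):
    """ISTA for the LASSO: gradient step on the quadratic, then soft-threshold."""
    L = np.linalg.norm(A, 2) ** 2     # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        g = x - (A.T @ (A @ x - b)) / L                         # gradient step
        x = np.sign(g) * np.maximum(np.abs(g) - lam / L, 0.0)   # soft-threshold
    return x

x_hat = ista(A, b)
rel_err = np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true)
print(rel_err)  # relative recovery error; small when x_true is sparse enough
```

The point is that with only 50 measurements of 200 unknowns the system is hopelessly underdetermined, yet the sparsity prior picks out the right solution.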
Another closely related problem is low-rank matrix recovery, which uses low rank as the prior knowledge to solve linear inverse problems over matrices, and which has likewise developed its own body of theory.
The ideas of compressed sensing have since been applied in more settings, such as nonlinear inverse problems. The theory is developing rapidly, though applications run a step ahead of it. I am personally interested in bilinear inverse problems, such as blind deconvolution and matrix factorization.
In applying compressed sensing, we find that most signals are not sparse in themselves (that is, their representation in the natural basis is not sparse), but become sparse after a suitable linear transformation (that is, they are sparse in some other basis or frame). For example, in harmonic retrieval the time-domain signal is not sparse, but it is sparse in the Fourier domain. Likewise, most natural images are not sparse, but a sparse representation can be obtained via the discrete cosine transform (DCT) or a wavelet transform. A once-popular research topic is dictionary learning and transform learning, which adaptively learn sparse representations from a large number of signal examples.
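The harmonic-retrieval point above can be seen in a few lines (a toy example; the two frequencies are arbitrary): a sum of sinusoids is dense in the time domain but has only a handful of significant Fourier coefficients.

```python
import numpy as np

N = 256
t = np.arange(N)
# Two sinusoids at bins 10 and 40: dense in time, sparse in frequency.
signal = np.sin(2 * np.pi * 10 * t / N) + 0.5 * np.sin(2 * np.pi * 40 * t / N)

time_nonzeros = int(np.sum(np.abs(signal) > 1e-8))
spectrum = np.fft.fft(signal) / N
freq_nonzeros = int(np.sum(np.abs(spectrum) > 1e-8))

print(time_nonzeros)   # nearly all 256 time samples are nonzero
print(freq_nonzeros)   # only 4 Fourier coefficients (bins ±10, ±40) are significant
```

The Fourier basis is the "other basis" in which this signal is sparse; for natural images the DCT or a wavelet basis plays the same role.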
Deep learning is a means of machine learning; see Stephen Wang's explanation above. Deep learning usually involves non-linear mappings, and the purpose of the resulting data representation is usually no longer signal recovery but a machine learning task such as classification.
Here are the differences as I understand them:
Sparse representation learning in signal processing focuses on modeling the signal: the goal is a faithful representation of the original signal. We often need both the transform and its inverse in order to achieve signal reconstruction. Even in problems that do not require reconstruction, we need the representation to discriminate meaningful signal from noise. So these transforms usually have many good properties (invertibility, a well-behaved condition number, and so on).
In deep learning, and in machine learning more broadly, the goal of the data representation varies from problem to problem, but usually we do not need the representation process to be invertible. For example, in classification our goal is to transform the data into a "meaningful" space where different classes of signals are separated. Such transforms can be linear or non-linear, invertible or not, and can map to sparse representations or to other meaningful, easy-to-classify representations.
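This contrast between the two regimes can be made concrete with a toy example (illustrative, not from the original answer): an orthonormal transform of the kind favored in signal processing is perfectly invertible, while a ReLU feature map typical of deep networks discards information.

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.standard_normal(8)

# Orthonormal transform (a random orthogonal matrix): the inverse is just the
# transpose, so reconstruction is exact -- the signal-processing regime.
Q, _ = np.linalg.qr(rng.standard_normal((8, 8)))
x_rec = Q.T @ (Q @ x)
invertible = np.allclose(x_rec, x)
print(invertible)      # True: perfect reconstruction

# ReLU is not invertible: distinct inputs can map to the same output, which is
# fine (even useful) when the goal is class separation rather than recovery.
relu = lambda v: np.maximum(v, 0.0)
collision = np.array_equal(relu(np.array([-1.0, 2.0])), relu(np.array([-3.0, 2.0])))
print(collision)       # True: two different inputs, one output
```

Losing invertibility is not a defect here; collapsing within-class variation is often exactly what a classifier's representation should do.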