Sparse Models and Structured Sparse Models


Sparse Coding series:

    • (1) Spatial Pyramid Matching (SPM): a summary
    • (2) Sparse representation of images: a summary of ScSPM and LLC
    • (3) Understanding sparse coding
    • (4) Sparse models and structured sparse models

---------------------------------------------------------------------------

ScSPM and its improvement LLC have been mentioned several times before: ScSPM uses unstructured sparse coding, while LLC uses structured sparse coding. This time I want to give a more systematic introduction to structured sparsity. At the end of this article there are several typical examples, with links to source code (MATLAB) and PDFs of the cited papers for your own experiments.

Data representation (not limited to images) is often formulated as the following minimization problem:

$$\min_{D \in \mathcal{D},\, Z}\ \|X - DZ\|_F^2 + \lambda \sum_{i=1}^{n} \Omega(z_i) \tag{1}$$

Here X is the feature matrix of the observed data, D is the dictionary, and Z holds the codes of the data over the dictionary, one column z_i per sample. The regularizer Ω and the constraint set 𝒟 give the codes and the dictionary a certain structure. When D is fixed, solving for Z is called representation pursuit; when D and Z are both unknown, finding D is the dictionary learning problem.
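As an illustration of problem (1), here is a toy alternating-minimization sketch in NumPy. It is my own illustration, not code from the post or its linked packages: the codes Z are updated by a single ISTA step for the L1 penalty, and the dictionary D by least squares with renormalized columns; all names and parameters are illustrative.

```python
import numpy as np

def dictionary_learning(X, n_atoms, n_iter=50, lam=0.1, seed=0):
    """Toy alternating minimization for
    min_{D,Z} ||X - D Z||_F^2 + lam * sum_i ||z_i||_1."""
    rng = np.random.default_rng(seed)
    n_features, n_samples = X.shape
    D = rng.standard_normal((n_features, n_atoms))
    D /= np.linalg.norm(D, axis=0)            # unit-norm atoms
    Z = np.zeros((n_atoms, n_samples))
    for _ in range(n_iter):
        # representation pursuit: one ISTA step (gradient + soft-threshold)
        step = 1.0 / np.linalg.norm(D.T @ D, 2)   # 1 / Lipschitz constant
        G = D.T @ (D @ Z - X)                     # gradient of the quadratic
        Z = Z - step * G
        Z = np.sign(Z) * np.maximum(np.abs(Z) - step * lam, 0.0)
        # dictionary update: least squares, then renormalize the columns
        D = X @ np.linalg.pinv(Z)
        D /= np.linalg.norm(D, axis=0) + 1e-12
    return D, Z
```

In practice one would run many ISTA steps per outer iteration (or use a solver such as SPAMS); this sketch only shows the alternating structure behind (1).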

Sparse representation usually imposes a constraint on Z so that each column z_i has only a few nonzero coefficients. The simplest such regularizer is

$$\Omega(z_i) = \|z_i\|_1 \tag{2}$$

and the problem becomes the lasso. Hard-VQ (k-means) is an even more severe form of sparse coding: compared to the L1-norm, it forces exactly one nonzero coefficient and therefore introduces a large reconstruction error, so its performance is worse. This was the subject of the ScSPM and LLC post.
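To make the contrast concrete, here is a small NumPy sketch of my own (not from the post) of the two coding rules: element-wise soft-thresholding, the building block of lasso solvers, keeps several shrunken coefficients, while hard-VQ forces a one-hot code on the nearest atom, which is exactly where its extra reconstruction error comes from.

```python
import numpy as np

def soft_threshold(v, t):
    """Prox of t * ||.||_1: shrink each coefficient toward zero by t,
    zeroing out those with magnitude below t (lasso-style sparsity)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def hard_vq(D, x):
    """Hard vector quantization: a one-hot code on the nearest atom."""
    dists = np.linalg.norm(D - x[:, None], axis=0)  # distance to each column
    z = np.zeros(D.shape[1])
    z[np.argmin(dists)] = 1.0
    return z
```

Soft-thresholding reconstructs x from a weighted blend of atoms; hard-VQ must approximate x with a single unweighted atom.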

An important reason LLC improves on the lasso is the lasso's lack of smoothness: similar inputs can receive very different codes, and the underlying cause is that the nonzero elements of Z carry no structural information (unstructured sparse coding). Much subsequent work therefore proposes sparse models with structure. Each codeword (column) of the dictionary D is called a dictionary atom. Define a group g as a set of atom indices, g ⊆ {1, …, K}, and let 𝒢 be the collection of all such groups. The groups in 𝒢 may or may not overlap (this distinction gives rise to different group-sparse models). The group regularizer can be expressed as:

$$\Omega(z_i) = \sum_{g \in \mathcal{G}} \|z_{i,g}\|_2 \tag{3}$$

where z_{i,g} is the sub-vector of z_i containing only the elements indexed by g. Within each group the L2-norm is used; since each group norm is nonnegative, the sum over groups acts like an L1-norm at the group level. This produces group selection: the coefficients of a group are zeroed out, or kept, together. This is still not perfect, because sparsity within a group is not enforced, so a refinement (the sparse group lasso) adds the (2) term to (3) to ensure within-group sparsity:

 

$$\Omega(z_i) = \sum_{g \in \mathcal{G}} \|z_{i,g}\|_2 + \mu \|z_i\|_1 \tag{4}$$

 

It can be seen that when every group in 𝒢 is a singleton, each group norm reduces to an absolute value and (3) degenerates to the lasso penalty (2).
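For non-overlapping groups, the proximal operator of the penalty in (3) has a closed form: block-wise soft-thresholding, where each group's sub-vector is shrunk and dropped entirely once its L2-norm falls below the threshold. A minimal NumPy sketch (the function name and the group encoding as index lists are my own):

```python
import numpy as np

def group_soft_threshold(z, groups, t):
    """Prox of t * sum_g ||z_g||_2 for non-overlapping groups (eq. (3)):
    each group is rescaled by max(1 - t/||z_g||, 0), so a whole group is
    zeroed together once its norm drops below t (group selection)."""
    out = z.copy()
    for g in groups:
        norm = np.linalg.norm(z[g])
        scale = max(1.0 - t / norm, 0.0) if norm > 0 else 0.0
        out[g] = scale * z[g]
    return out
```

The prox of the sparse group lasso penalty (4) composes the two thresholds: element-wise soft-thresholding first, then this group-wise shrinkage.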

Some common structured sparsity models are listed below:

Hierarchical Sparse Coding [Code | read more]: the nonzero coefficients obey a hierarchy, i.e. any two groups either do not overlap, or, if they do overlap, one must contain the other. A typical hierarchy is a tree structure.
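The nesting condition above can be stated as a tiny check (illustrative code of my own, not from the post): a collection of groups is hierarchical iff every pair is either disjoint or nested.

```python
def is_hierarchy(groups):
    """Check the hierarchical-group condition: any two groups are either
    disjoint or one contains the other (e.g. the nodes of a tree)."""
    sets = [set(g) for g in groups]
    for i in range(len(sets)):
        for j in range(i + 1, len(sets)):
            a, b = sets[i], sets[j]
            if a & b and not (a <= b or b <= a):  # overlap without nesting
                return False
    return True
```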

Overlapping Group Sparse Coding [Code | read more]: the constraint is relaxed so that groups may overlap each other arbitrarily. This model is reported to be very effective for describing genetic data; you may want to try it.

Graph-guided Sparse Coding [Code | read more]: build a graph whose nodes are the atoms of the dictionary. Graph-guided sparsity differs from the group sparse coding above in that it can encode more complex structural information. Its form is:

$$\Omega(z_i) = \sum_{(u,v) \in E} w_{uv}\, |z_{i,u} - z_{i,v}| \tag{5}$$

Here the penalty acts on differences of coefficients. Intuitively, each atom in the dictionary is regarded as a node of the graph, E is the edge set, and w_{uv} is the weight of the edge between nodes u and v. In many papers the weight encodes, for example, semantic or structural association between atoms.
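As a sketch (function name and argument layout are my own), evaluating the penalty in (5) for a given code vector shows the intended effect: heavily weighted edges pull the coefficients of connected atoms toward each other.

```python
def graph_guided_penalty(z, edges, weights):
    """Evaluate sum over edges (u, v) of w_uv * |z_u - z_v| from eq. (5).
    Connected atoms with equal coefficients contribute nothing."""
    return sum(w * abs(z[u] - z[v]) for (u, v), w in zip(edges, weights))
```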

