Stanford UFLDL Tutorial - Exercise: Sparse Coding

Contents

1 Sparse Coding
1.1 Dependencies
1.2 Step 0: Initialization
1.3 Step 1: Sample Patches
1.4 Step 2: Implement and Check Sparse Coding Cost Functions
1.5 Step 3: Iterative Optimization
Sparse Coding

In this exercise, you'll implement sparse coding and topographic sparse coding on black-and-white natural images.

We have provided some starter code in the file Sparse_coding_exercise.zip. You should write your code at the places indicated by 'YOUR CODE HERE' in the files.

For this exercise, you will need to modify sparseCodingWeightCost.m, sparseCodingFeatureCost.m and sparseCodingExercise.m.

Dependencies

You will need:

  • computeNumericalGradient.m from Exercise: Sparse Autoencoder
  • display_network.m from Exercise: Sparse Autoencoder

If you have not completed the exercise listed above, we strongly suggest you complete it first.

Step 0: Initialization

In this step, we initialize some parameters used for the exercise.

Step 1: Sample Patches

In this step, we sample some patches from the IMAGES.mat dataset, which comprises pre-whitened black-and-white natural images.

Step 2: Implement and Check Sparse Coding Cost Functions
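The starter code performs this sampling in MATLAB. As a rough illustration only, a patch sampler might look like the following Python/NumPy sketch; the function name, signature, and array shapes here are my assumptions, not part of the starter code:

```python
import numpy as np

def sample_patches(images, patch_dim=8, num_patches=10000, seed=0):
    """Sample random square patches from a stack of (pre-whitened) images.

    images: array of shape (height, width, num_images).
    Returns an array of shape (patch_dim**2, num_patches), one patch per column.
    """
    rng = np.random.default_rng(seed)
    height, width, num_images = images.shape
    patches = np.empty((patch_dim * patch_dim, num_patches))
    for i in range(num_patches):
        img = rng.integers(num_images)              # pick a random image
        r = rng.integers(height - patch_dim + 1)    # random top-left corner
        c = rng.integers(width - patch_dim + 1)
        patches[:, i] = images[r:r + patch_dim, c:c + patch_dim, img].ravel()
    return patches
```

Each column of the result is one flattened patch, matching the column-per-example convention the cost functions expect.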

In this step, you should implement the two sparse coding cost functions:

  • sparseCodingWeightCost in sparseCodingWeightCost.m, which is used for optimizing the weight cost given the features
  • sparseCodingFeatureCost in sparseCodingFeatureCost.m, which is used for optimizing the feature cost given the weights

Each of these functions should compute the appropriate cost and gradient. You may wish to implement the non-topographic version of sparseCodingFeatureCost first, ignoring the grouping matrix and assuming that none of the features are grouped. You can then extend this to the topographic version later. Alternatively, you may implement the topographic version directly; using the non-topographic version would then simply involve setting the grouping matrix to the identity matrix.
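In the tutorial's notation, the objective split between these two functions is ||As - x||² + λ Σ sqrt(V s² + ε) + γ||A||², where V is the grouping matrix. As a hedged sketch of the two costs and their gradients, in Python/NumPy rather than the exercise's MATLAB (function names and the omission of any batch-size normalization are my assumptions):

```python
import numpy as np

def sparse_coding_weight_cost(A, s, x, gamma):
    # Reconstruction error plus weight decay; gradient is w.r.t. the basis A.
    resid = A @ s - x                                   # (n, m) residual
    cost = np.sum(resid ** 2) + gamma * np.sum(A ** 2)
    grad = 2 * resid @ s.T + 2 * gamma * A
    return cost, grad

def sparse_coding_feature_cost(A, s, x, lam, eps, V=None):
    # Reconstruction error plus smoothed (topographic) sparsity penalty;
    # gradient is w.r.t. the features s. V is the grouping matrix; V = I
    # recovers the non-topographic case.
    if V is None:
        V = np.eye(s.shape[0])
    resid = A @ s - x
    grouped = V @ (s ** 2) + eps                        # (groups, m)
    cost = np.sum(resid ** 2) + lam * np.sum(np.sqrt(grouped))
    grad = 2 * A.T @ resid + lam * s * (V.T @ (1.0 / np.sqrt(grouped)))
    return cost, grad
```

The sparsity gradient follows from the chain rule: d/ds_j of Σ_g sqrt((V s²)_g + ε) is s_j times the j-th entry of Vᵀ applied to the elementwise reciprocal square roots.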

Once you have implemented these functions, you should check the gradients numerically.

Implementation tip - gradient checking the feature cost. One particular point to note is that when checking the gradient for the feature cost, epsilon should be set to a larger value, for instance 1e-2 (as has been done for you in the checking code provided), to ensure that checking the gradient numerically makes sense. This is necessary because as epsilon becomes smaller, the function sqrt(x + epsilon) becomes "sharper" and more "pointed", making the numerical gradient computed near 0 less and less accurate. To see this, consider what would happen if the numerical gradient were computed using a point with x less than 0 and a point with x greater than 0: the computed numerical slope would be wildly inaccurate.

Step 3: Iterative Optimization

In this step, you'll iteratively optimize for the weights and features to learn a basis for the data, as described in the section on Sparse Coding. Mini-batching and initialization of the features s have already been done for you. However, you still need to fill in the analytic solution to the optimization problem with respect to the weight matrix, given the feature matrix.
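For reference, setting the gradient of the weight cost ||As - x||² + γ||A||² to zero gives A(ssᵀ + γI) = xsᵀ, hence A = xsᵀ(ssᵀ + γI)⁻¹. A minimal NumPy sketch of that closed form, assuming this simplified objective (the starter code may additionally normalize by the batch size):

```python
import numpy as np

def optimal_weights(s, x, gamma):
    """Closed-form minimizer of ||A s - x||_F^2 + gamma * ||A||_F^2 over A.

    Setting the gradient 2 (A s - x) s^T + 2 gamma A to zero gives
    A (s s^T + gamma I) = x s^T.
    """
    k = s.shape[0]
    return x @ s.T @ np.linalg.inv(s @ s.T + gamma * np.eye(k))
```

The checking code's criterion, that the gradient of the weight cost is close to zero at this A, follows directly from the derivation.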

Once that is done, you should check that your solution is correct using the given checking code, which verifies that the gradient at the point determined by your analytic solution is close to 0. Once your solution has been verified, comment out the checking code and run the iterative optimization code. The iterations should not take long to run, and after enough iterations you should be able to see bases that look like edges, similar to those you learned in the sparse autoencoder exercise.

For the non-topographic case, these features will not be "ordered", and will look something like the following:

For the topographic case, the features will be "ordered topographically", and will look something like the following:

From:http://ufldl.stanford.edu/wiki/index.php/exercise:sparse_coding
