Stanford Recommended Reading List


The recommended reading list from the Stanford Deep Learning website: UFLDL Recommended Readings.

If you're learning about UFLDL (Unsupervised Feature Learning and Deep Learning), this is a list of papers to consider reading. We're assuming you're already familiar with basic machine learning at the level of CS229 (lecture notes available).

The Basics:

    • CS294A Neural Networks / Sparse Autoencoder tutorial. (Most of this is now in the UFLDL tutorial, but the exercise is still on the CS294A website.)
    • Natural Image Statistics book, Hyvärinen et al.
      • This is long, so just skim or skip the chapters you already know.
      • Important chapters: 5 (PCA and whitening; you'll probably already know the PCA stuff), 6 (sparse coding), 7 (ICA), 10 (ISA), 11 (TICA), 16 (temporal models). A short whitening sketch follows this list.
    • Olshausen and Field. Emergence of Simple-Cell Receptive Field Properties by Learning a Sparse Code for Natural Images. Nature, 1996. (Sparse coding.)
    • Rajat Raina, Alexis Battle, Honglak Lee, Benjamin Packer and Andrew Y. Ng. Self-Taught Learning: Transfer Learning from Unlabeled Data. ICML 2007.
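
To make the PCA-and-whitening material concrete, here is a minimal PCA-whitening sketch in plain NumPy. It only illustrates the standard recipe (center, rotate into the eigenbasis, rescale each axis by its eigenvalue); the function name and the regularization epsilon are our own illustrative choices, not from the book.

    import numpy as np

    def pca_whiten(X, eps=1e-5):
        """PCA-whiten the rows of X (shape: n_samples x n_features)."""
        X = X - X.mean(axis=0)          # zero-center each feature
        cov = X.T @ X / X.shape[0]      # feature covariance matrix
        U, S, _ = np.linalg.svd(cov)    # eigenvectors U, eigenvalues S
        Xrot = X @ U                    # rotate into the PCA basis
        return Xrot / np.sqrt(S + eps)  # rescale each axis to ~unit variance

    # Whitened features should be decorrelated with roughly unit variance:
    X = np.random.randn(1000, 20) @ np.random.randn(20, 20)
    print(np.allclose(np.cov(pca_whiten(X), rowvar=False), np.eye(20), atol=0.1))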


Autoencoders:

    • Hinton, G. E. and Salakhutdinov, R. R. Reducing the Dimensionality of Data with Neural Networks. Science, 2006.
      • If you want to play with the code, it is also available online.
    • Bengio, Y., Lamblin, P., Popovici, D., Larochelle, H. Greedy Layer-Wise Training of Deep Networks. NIPS 2006.
    • Pascal Vincent, Hugo Larochelle, Yoshua Bengio and Pierre-Antoine Manzagol. Extracting and Composing Robust Features with Denoising Autoencoders. ICML 2008.
      • (They have a nice model, but then backwards rationalize it into a probabilistic model. Ignore the backwards-rationalized probabilistic model [Section 4].) A minimal denoising-autoencoder sketch follows this list.
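
To show what the denoising-autoencoder idea amounts to mechanically, here is a minimal plain-NumPy sketch: corrupt the input with masking noise, encode, decode, and backpropagate the reconstruction error against the clean input. The tied weights, sigmoid units, noise level, and toy data are common illustrative choices on our part, not prescriptions from the Vincent et al. paper.

    import numpy as np

    rng = np.random.default_rng(0)
    sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

    n_vis, n_hid, lr = 64, 32, 0.1
    W = rng.normal(0.0, 0.01, (n_vis, n_hid))  # tied weights: W encodes, W.T decodes
    b_h, b_v = np.zeros(n_hid), np.zeros(n_vis)

    data = (rng.random((500, n_vis)) < 0.2).astype(float)  # toy binary "images"

    for _ in range(10):
        for x in data:
            x_tilde = x * (rng.random(n_vis) > 0.3)  # masking corruption
            h = sigmoid(x_tilde @ W + b_h)           # encode the CORRUPTED input
            y = sigmoid(h @ W.T + b_v)               # reconstruct
            d_y = y - x                              # cross-entropy delta vs. the CLEAN input
            d_h = (d_y @ W) * h * (1.0 - h)
            W -= lr * (np.outer(x_tilde, d_h) + np.outer(d_y, h))  # encoder + decoder grads
            b_h -= lr * d_h
            b_v -= lr * d_y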


Analyzing deep learning/why does deep learning work:

    • H. Larochelle, D. Erhan, A. Courville, J. Bergstra, and Y. Bengio. An Empirical Evaluation of Deep Architectures on Problems with Many Factors of Variation. ICML 2007.
      • (Someone read this and let us know if it is worth keeping. [Most model-related material is already covered by other papers, and it seems not many impactful conclusions can be drawn from the results, but it can serve as reinforcement reading on deep models.])
    • Dumitru Erhan, Yoshua Bengio, Aaron Courville, Pierre-Antoine Manzagol, Pascal Vincent, and Samy Bengio. Why Does Unsupervised Pre-training Help Deep Learning? JMLR 2010.
    • Ian J. Goodfellow, Quoc V. Le, Andrew M. Saxe, Honglak Lee and Andrew Y. Ng. Measuring Invariances in Deep Networks. NIPS 2009.


RBMs:

    • Tutorial on RBMs.
      • But ignore the Theano code examples. (A plain-NumPy CD-1 sketch follows this list.)
      • (Someone tell us if this should be moved later. It is useful for understanding some of the DL literature, but not needed for many of the later papers? [Seems OK to leave in; it is a useful introduction if the reader has no idea about RBMs and has to deal with Hinton's Science paper or 3-way RBMs right away.])
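
Since the tutorial's Theano examples are to be skipped, here is what one step of RBM training with contrastive divergence (CD-1) looks like in plain NumPy. The layer sizes, learning rate, and toy data are illustrative; using probabilities rather than samples on the negative phase is a common simplification, not the only valid variant.

    import numpy as np

    rng = np.random.default_rng(0)
    sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

    n_vis, n_hid, lr = 64, 32, 0.05
    W = rng.normal(0.0, 0.01, (n_vis, n_hid))
    a, b = np.zeros(n_vis), np.zeros(n_hid)  # visible and hidden biases

    data = (rng.random((500, n_vis)) < 0.2).astype(float)  # toy binary data

    for _ in range(5):
        for v0 in data:
            ph0 = sigmoid(v0 @ W + b)                     # P(h=1 | v0): positive phase
            h0 = (rng.random(n_hid) < ph0).astype(float)  # sample the hidden state
            pv1 = sigmoid(h0 @ W.T + a)                   # reconstruction P(v=1 | h0)
            ph1 = sigmoid(pv1 @ W + b)                    # hidden probs at the reconstruction
            W += lr * (np.outer(v0, ph0) - np.outer(pv1, ph1))
            a += lr * (v0 - pv1)                          # positive minus negative statistics
            b += lr * (ph0 - ph1)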


Convolutional Networks:

    • Tutorial on convolutional neural networks.
      • But ignore the Theano code examples. (A plain-NumPy convolution/pooling sketch follows this list.)
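
As a Theano-free illustration of the two core operations in these networks, here is a minimal "valid" 2-D convolution plus non-overlapping mean pooling in plain NumPy. The function names, the tanh nonlinearity, and the sizes are our own choices for the sketch.

    import numpy as np

    def conv2d_valid(image, kernel):
        """'Valid' 2-D convolution of a single-channel image with one kernel."""
        kh, kw = kernel.shape
        H, W = image.shape
        k = kernel[::-1, ::-1]  # flip for true convolution (vs. correlation)
        out = np.empty((H - kh + 1, W - kw + 1))
        for i in range(out.shape[0]):
            for j in range(out.shape[1]):
                out[i, j] = np.sum(image[i:i+kh, j:j+kw] * k)
        return out

    def mean_pool(fmap, p=2):
        """Non-overlapping p x p mean pooling."""
        H, W = fmap.shape
        return fmap[:H//p*p, :W//p*p].reshape(H//p, p, W//p, p).mean(axis=(1, 3))

    img = np.random.rand(8, 8)
    feat = np.tanh(conv2d_valid(img, np.random.randn(3, 3)))  # 6x6 feature map
    print(mean_pool(feat).shape)  # (3, 3)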


Applications:

    • Computer Vision
      • Jianchao Yang, Kai Yu, Yihong Gong, Thomas Huang. Linear Spatial Pyramid Matching Using Sparse Coding for Image Classification. CVPR 2009.
      • A. Torralba, R. Fergus and Y. Weiss. Small codes and large image databases for recognition. CVPR 2008.
    • Audio recognition
      • Honglak Lee, Yan Largman, Peter Pham and Andrew Y. Ng. Unsupervised Feature Learning for Audio Classification Using Convolutional Deep Belief Networks. NIPS 2009.


Natural Language Processing:

    • Yoshua Bengio, Réjean Ducharme, Pascal Vincent and Christian Jauvin. A Neural Probabilistic Language Model. JMLR 2003.
    • R. Collobert and J. Weston. A Unified Architecture for Natural Language Processing: Deep Neural Networks with Multitask Learning. ICML 2008.
    • Richard Socher, Jeffrey Pennington, Eric Huang, Andrew Y. Ng, and Christopher D. Manning. Semi-Supervised Recursive Autoencoders for Predicting Sentiment Distributions. EMNLP 2011.
    • Richard Socher, Eric Huang, Jeffrey Pennington, Andrew Y. Ng, and Christopher D. Manning. Dynamic Pooling and Unfolding Recursive Autoencoders for Paraphrase Detection. NIPS 2011.
    • Mnih, A. and Hinton, G. E. Three New Graphical Models for Statistical Language Modelling. ICML 2007.


Advanced Stuff:

    • Slow Feature Analysis:
      • Slow Feature Analysis Yields a Rich Repertoire of Complex Cell Properties. Journal of Vision, 2005.
    • Predictive Sparse Decomposition
      • Koray Kavukcuoglu, Marc'Aurelio Ranzato, and Yann LeCun. "Fast Inference in Sparse Coding Algorithms with Applications to Object Recognition." Computational and Biological Learning Lab, Courant Institute, NYU, 2008.
      • Kevin Jarrett, Koray Kavukcuoglu, Marc'Aurelio Ranzato, and Yann LeCun. "What Is the Best Multi-Stage Architecture for Object Recognition?" ICCV 2009.


Mean-Covariance Models:

    • M. Ranzato, A. Krizhevsky, G. Hinton. Factored 3-Way Restricted Boltzmann Machines for Modeling Natural Images. AISTATS 2010.
    • M. Ranzato, G. Hinton. Modeling Pixel Means and Covariances Using Factorized Third-Order Boltzmann Machines. CVPR 2010.
      • (Someone tell us if you need to read the 3-way RBM paper before the mcRBM one. [I didn't find it necessary; in fact, the CVPR paper seemed easier to understand.])
    • Dahl, G., Ranzato, M., Mohamed, A. and Hinton, G. E. Phone Recognition with the Mean-Covariance Restricted Boltzmann Machine. NIPS 2010.
    • Y. Karklin and M. S. Lewicki. Emergence of Complex Cell Properties by Learning to Generalize in Natural Scenes. Nature, 2008.
      • (Someone tell us if this should be here. Interesting algorithm + nice visualizations, though maybe slightly hard to understand. [Seems a good reminder that there are other existing models.])


Overview:

    • Yoshua Bengio. Learning Deep Architectures for AI. FTML 2009.
      • (Broad landscape description of the field, but the technical details there are hard to follow, so ignore them. This is also easier to read after you've gone over some of the literature of the field.)


Practical Guides:

    • Geoff Hinton. A Practical Guide to Training Restricted Boltzmann Machines. UTML TR 2010-003.
      • (A practical guide: read it if you're trying to implement an RBM, but otherwise skip it, since it is not really a tutorial.)
    • Y. LeCun, L. Bottou, G. Orr and K. Müller. Efficient BackProp. Neural Networks: Tricks of the Trade, Springer, 1998.
      • (Read it if you're trying to implement backprop, but otherwise skip it, since it is very low-level engineering/hackery tricks and isn't that satisfying to read.)


Also, for other lists of papers:

    • Honglak Lee's course
    • Geoff Hinton's tutorial
