Literature | The Most-Cited Deep Learning Papers, 2010-2016 (revised edition)


Originally from: http://blog.csdn.net/u010402786/article/details/51682917

I. Books

Deep Learning (2015)

I. Goodfellow, Y. Bengio, and A. Courville

http://www.deeplearningbook.org/

II. Theory

1. Distilling knowledge from a neural network

Distilling the knowledge in a neural network (2015)

G. Hinton et al.

2. Deep neural networks are easily fooled: high-confidence predictions for unrecognizable images

Deep neural networks are easily fooled: High confidence predictions for unrecognizable images (2015)

A. Nguyen et al.

3. How transferable are features in deep neural networks?

How transferable are features in deep neural networks? (2014)

J. Yosinski et al.

4. Delving deep into the details of convolutional networks

Return of the devil in the details: Delving deep into convolutional nets (2014)

K. Chatfield et al.

5. Why is unsupervised pre-training helpful for deep learning?

Why does unsupervised pre-training help deep learning? (2010)

D. Erhan et al. (Bengio)

6. Understanding the difficulty of training deep feedforward neural networks

Understanding the difficulty of training deep feedforward neural networks (2010)

X. Glorot and Y. Bengio

III. Optimization / Network Architecture

Introduction: Entries 7 through 14 in this section cover methods for optimizing neural networks; the batch normalization paper (entry 7) in particular has had a huge impact in industry. The remaining entries cover changes to network architecture, including fully convolutional networks. All of these references are highly practical and well worth consulting.

7. Batch normalization: accelerating the training of deep networks by reducing internal covariate shift (recommended)

Batch normalization: Accelerating deep network training by reducing internal covariate shift (2015)

S. Ioffe and C. Szegedy (Google)
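The training-time computation at the heart of the batch normalization paper is small enough to sketch directly. Below is a minimal NumPy illustration of the forward pass; the function and variable names are ours, and the running statistics used at inference time are omitted:

```python
import numpy as np

def batch_norm_forward(x, gamma, beta, eps=1e-5):
    """Normalize each feature over the mini-batch, then scale and shift.

    x: (N, D) mini-batch; gamma, beta: (D,) learned scale and shift.
    """
    mu = x.mean(axis=0)                     # per-feature batch mean
    var = x.var(axis=0)                     # per-feature batch variance
    x_hat = (x - mu) / np.sqrt(var + eps)   # zero mean, unit variance
    return gamma * x_hat + beta             # restore representational power

x = 3.0 * np.random.randn(64, 10) + 5.0     # a badly scaled activation batch
y = batch_norm_forward(x, gamma=np.ones(10), beta=np.zeros(10))
```

With gamma = 1 and beta = 0, every column of y has zero mean and unit variance no matter how x was scaled, which is what stabilizes the input distributions seen by later layers.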

8. Dropout: A simple way to prevent neural networks from overfitting

Dropout: A simple way to prevent neural networks from overfitting (2014)

N. Srivastava et al. (Hinton)
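Dropout itself is a one-line idea. Here is a NumPy sketch using the now-common "inverted" formulation, which rescales at training time rather than at test time as the original paper does; names are ours:

```python
import numpy as np

def dropout(x, p_drop=0.5, train=True, seed=0):
    """Inverted dropout: zero each unit with probability p_drop during
    training, rescaling survivors so the expected activation is unchanged."""
    if not train or p_drop == 0.0:
        return x                               # test time: identity
    rng = np.random.default_rng(seed)
    mask = rng.random(x.shape) >= p_drop       # keep with probability 1 - p_drop
    return x * mask / (1.0 - p_drop)

x = np.ones((4, 8))
train_out = dropout(x, p_drop=0.5)             # entries are 0.0 or 2.0
test_out = dropout(x, p_drop=0.5, train=False) # identical to x
```

Because the rescaling happens at training time, the network can be used unchanged at test time, which is why the inverted form is the one most frameworks implement.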

9. Adam: A method for stochastic optimization

Adam: A method for stochastic optimization (2014)

D. Kingma and J. Ba
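The Adam update rule is fully specified in the paper; a minimal NumPy transcription (variable names are ours) looks like this:

```python
import numpy as np

def adam_step(theta, grad, state, lr=0.05, b1=0.9, b2=0.999, eps=1e-8):
    """One Adam update; state = (m, v, t) holds the moment estimates."""
    m, v, t = state
    t += 1
    m = b1 * m + (1 - b1) * grad             # first-moment (mean) estimate
    v = b2 * v + (1 - b2) * grad ** 2        # second-moment estimate
    m_hat = m / (1 - b1 ** t)                # bias corrections for the
    v_hat = v / (1 - b2 ** t)                # zero-initialized moments
    return theta - lr * m_hat / (np.sqrt(v_hat) + eps), (m, v, t)

# Toy use: minimize f(theta) = theta^2, whose gradient is 2 * theta.
theta, state = 5.0, (0.0, 0.0, 0)
for _ in range(2000):
    theta, state = adam_step(theta, 2.0 * theta, state)
```

After a couple of thousand steps theta sits near the minimum at 0; the per-parameter division by sqrt(v_hat) is what makes the step sizes robust to gradient scale.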

10. On the importance of initialization and momentum in deep learning

On the importance of initialization and momentum in deep learning (2013)

I. Sutskever et al. (Hinton)

11. Regularization of neural networks using DropConnect

Regularization of neural networks using DropConnect (2013)

L. Wan et al. (LeCun)

12. Random search for hyper-parameter optimization

Random search for hyper-parameter optimization (2012)

J. Bergstra and Y. Bengio
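Bergstra and Bengio's point is that sampling configurations independently at random often beats grid search, and the skeleton is trivial to implement. A sketch in plain Python — the objective and search space here are made up purely for illustration:

```python
import random

def random_search(objective, space, n_trials=50, seed=0):
    """Sample each hyper-parameter independently; keep the lowest score."""
    rng = random.Random(seed)
    best_score, best_params = float("inf"), None
    for _ in range(n_trials):
        params = {name: sample(rng) for name, sample in space.items()}
        score = objective(params)
        if score < best_score:
            best_score, best_params = score, params
    return best_score, best_params

# Hypothetical search space: log-uniform learning rate, integer depth.
space = {
    "lr": lambda r: 10 ** r.uniform(-4, 0),
    "depth": lambda r: r.randint(1, 8),
}
score, params = random_search(
    lambda p: (p["lr"] - 0.1) ** 2 + (p["depth"] - 4) ** 2, space
)
```

Sampling each dimension independently means no trials are wasted on dimensions that turn out not to matter, which is the paper's central argument against grids.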

13. Deep residual learning for image recognition

Deep residual learning for image recognition (2016)

K. He et al. (Microsoft)
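The residual idea is easy to state in code: the block learns a correction F(x) and adds the input back through an identity shortcut. A simplified fully connected version follows (the paper's blocks use convolutions and batch normalization; names here are ours):

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def residual_block(x, W1, W2):
    """y = relu(x + F(x)), where F is a small two-layer transform."""
    f = relu(x @ W1) @ W2        # the learned residual F(x)
    return relu(x + f)           # identity shortcut: add the input back

# With all-zero weights F(x) = 0, so the block is the identity on x >= 0,
# which is part of why very deep stacks of such blocks remain trainable.
x = np.abs(np.random.randn(2, 4))
W0 = np.zeros((4, 4))
out = residual_block(x, W0, W0)
```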

14. Region-based convolutional networks for accurate object detection and segmentation

Region-based convolutional networks for accurate object detection and segmentation (2016)

R. Girshick et al. (Microsoft)

15. Going deeper with convolutions (GoogLeNet)

Going deeper with convolutions (2015)

C. Szegedy et al. (Google)

16. Fast R-CNN

Fast R-CNN (2015)

R. Girshick (Microsoft)

17. Faster R-CNN: Towards real-time object detection with region proposal networks

Faster R-CNN: Towards real-time object detection with region proposal networks (2015)

S. Ren et al.

18. Fully convolutional networks for semantic segmentation

Fully convolutional networks for semantic segmentation (2015)

J. Long et al.

19. Very deep convolutional networks for large-scale image recognition (VGG)

Very deep convolutional networks for large-scale image recognition (2014)

K. Simonyan and A. Zisserman

20. OverFeat: Integrated recognition, localization and detection using convolutional networks

OverFeat: Integrated recognition, localization and detection using convolutional networks (2014)

P. Sermanet et al. (LeCun)

21. Visualizing and understanding convolutional networks

Visualizing and understanding convolutional networks (2014)

M. Zeiler and R. Fergus

22. Maxout networks

Maxout networks (2013)

I. Goodfellow et al. (Bengio)

23. Network in Network

Network in Network (2013)

M. Lin et al.
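The maxout unit from Goodfellow et al.'s paper above takes the maximum over k learned affine pieces per output unit, so the network can learn any convex piecewise-linear activation. A small NumPy sketch, with shapes and names of our choosing:

```python
import numpy as np

def maxout(x, W, b):
    """Maxout unit: max over the k affine pieces along the last axis.

    x: (N, D), W: (D, M, k), b: (M, k) -> output (N, M).
    """
    z = np.einsum('nd,dmk->nmk', x, W) + b   # k linear responses per unit
    return z.max(axis=-1)                    # pick the largest piece

# With k = 2 pieces and weights (+1, -1), maxout computes |x| exactly.
x = np.linspace(-2.0, 2.0, 5).reshape(-1, 1)
W = np.array([[[1.0, -1.0]]])                # shape (1, 1, 2)
b = np.zeros((1, 2))
out = maxout(x, W, b)                        # each row holds max(x, -x)
```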

IV. Images

1. Reading text in natural scenes with convolutional neural networks

Reading text in the wild with convolutional neural networks (2016)

M. Jaderberg et al. (DeepMind)

2. The ImageNet large-scale visual recognition challenge

ImageNet Large Scale Visual Recognition Challenge (2015)

O. Russakovsky et al.

3. DRAW: A recurrent neural network for image generation

DRAW: A recurrent neural network for image generation (2015)

K. Gregor et al.

4. Rich feature hierarchies for accurate object detection and semantic segmentation

Rich feature hierarchies for accurate object detection and semantic segmentation (2014)

R. Girshick et al.

5. Learning and transferring mid-level image representations using convolutional neural networks

Learning and transferring mid-level image representations using convolutional neural networks (2014)

M. Oquab et al.

6. DeepFace: Closing the gap to human-level performance in face verification

DeepFace: Closing the gap to human-level performance in face verification (2014)

Y. Taigman et al. (Facebook)

V. Video / Human Action

1. Large-scale video classification with convolutional neural networks

Large-scale video classification with convolutional neural networks (2014)

A. Karpathy et al. (Fei-Fei)

2. DeepPose: Human pose estimation via deep neural networks

DeepPose: Human pose estimation via deep neural networks (2014)

A. Toshev and C. Szegedy (Google)

3. Two-stream convolutional networks for action recognition in videos

Two-stream convolutional networks for action recognition in videos (2014)

K. Simonyan et al.

4. 3D convolutional neural networks for human action recognition (well suited to modeling consecutive video frames)

3D convolutional neural networks for human action recognition (2013)

S. Ji et al.

5. Action recognition with improved trajectories

Action recognition with improved trajectories (2013)

H. Wang and C. Schmid

6. Learning hierarchical invariant spatio-temporal features for action recognition with independent subspace analysis

Learning hierarchical invariant spatio-temporal features for action recognition with independent subspace analysis (2011)

Q. Le et al.

VI. Natural Language Processing / Speech

1. Learning phrase representations with an RNN encoder-decoder for statistical machine translation

Learning phrase representations using RNN encoder-decoder for statistical machine translation (2014)

K. Cho et al. (Bengio)

2. A convolutional neural network for modelling sentences

A convolutional neural network for modelling sentences (2014)

N. Kalchbrenner et al.

3. Convolutional neural networks for sentence classification

Convolutional neural networks for sentence classification (2014)

Y. Kim

4. The Stanford CoreNLP natural language processing toolkit

The Stanford CoreNLP Natural Language Processing Toolkit (2014)

C. Manning et al.

5. Recursive deep models for semantic compositionality over a sentiment treebank

Recursive deep models for semantic compositionality over a sentiment treebank (2013)

R. Socher et al.

6. A recurrent neural network based language model

Recurrent neural network based language model (2010)

T. Mikolov et al.

7. Automatic speech recognition: A deep learning approach

Automatic Speech Recognition: A Deep Learning Approach (book, 2015)

D. Yu and L. Deng (Microsoft)

8. Speech recognition with deep recurrent neural networks

Speech recognition with deep recurrent neural networks (2013)

A. Graves et al. (Hinton)

9. Context-dependent pre-trained deep neural networks for large-vocabulary speech recognition

Context-dependent pre-trained deep neural networks for large-vocabulary speech recognition (2012)

G. Dahl et al.

10. Acoustic modeling using deep belief networks

Acoustic modeling using deep belief networks (2012)

A. Mohamed et al. (Hinton)

VII. Unsupervised Learning

1. Auto-encoding variational Bayes

Auto-encoding variational Bayes (2013)

D. Kingma and M. Welling
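The paper's key trick, reparameterization, and the closed-form KL term for a diagonal Gaussian posterior both fit in a few lines. A NumPy sketch with our own function names; the encoder and decoder networks are omitted:

```python
import numpy as np

def reparameterize(mu, log_var, rng):
    """z = mu + sigma * eps with eps ~ N(0, I), so sampling stays
    differentiable with respect to the encoder outputs mu and log_var."""
    eps = rng.standard_normal(mu.shape)
    return mu + np.exp(0.5 * log_var) * eps

def kl_to_standard_normal(mu, log_var):
    """Closed-form KL(q(z|x) || N(0, I)) for a diagonal Gaussian q."""
    return -0.5 * np.sum(1.0 + log_var - mu ** 2 - np.exp(log_var))

rng = np.random.default_rng(0)
mu, log_var = np.zeros(3), np.zeros(3)       # q is already standard normal
z = reparameterize(mu, log_var, rng)         # a differentiable sample
kl = kl_to_standard_normal(mu, log_var)      # zero when q matches the prior
```

Moving the randomness into eps is what lets gradients of the evidence lower bound flow back through mu and log_var into the encoder.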

2. Building high-level features with large-scale unsupervised learning

Building high-level features using large scale unsupervised learning (2013)

Q. Le et al.

3. An analysis of single-layer networks in unsupervised feature learning

An analysis of single-layer networks in unsupervised feature learning (2011)

A. Coates et al.

4. Stacked denoising autoencoders: learning useful representations in a deep network with a local denoising criterion

Stacked denoising autoencoders: Learning useful representations in a deep network with a local denoising criterion (2010)

P. Vincent et al. (Bengio)

5. A practical guide to training restricted Boltzmann machines

A Practical Guide to Training Restricted Boltzmann Machines (2010)

G. Hinton

VIII. Open-Source Frameworks

1. TensorFlow: Large-scale machine learning on heterogeneous distributed systems

TensorFlow: Large-scale machine learning on heterogeneous distributed systems (2016)

M. Abadi et al. (Google)

2. Theano: A Python framework for fast computation of mathematical expressions

Theano: A Python framework for fast computation of mathematical expressions (2016)

R. Al-Rfou et al. (Bengio)

3. MatConvNet: Convolutional neural networks for MATLAB

MatConvNet: Convolutional neural networks for MATLAB (2015)

A. Vedaldi and K. Lenc

4. Caffe: Convolutional architecture for fast feature embedding

Caffe: Convolutional architecture for fast feature embedding (2014)

Y. Jia et al.

IX. Latest Papers of 2016

1. Adversarially learned inference

Adversarially learned inference (2016)

V. Dumoulin et al.

2. Understanding convolutional neural networks

Understanding convolutional neural networks (2016)

J. Koushik

3. SqueezeNet: Achieves AlexNet-level accuracy with 50x fewer parameters and a model size under 1 MB

SqueezeNet: AlexNet-level accuracy with 50x fewer parameters and <1MB model size (2016)

F. Iandola et al.

4. Learning to compose neural networks for question answering

Learning to compose neural networks for question answering (2016)

J. Andreas et al.

5. Learning hand-eye coordination for robotic grasping with deep learning and large-scale data collection

Learning hand-eye coordination for robotic grasping with deep learning and large-scale data collection (Google)

S. Levine et al.

6. Taking the human out of the loop: A review of Bayesian optimization

Taking the human out of the loop: A review of Bayesian optimization (2016)

B. Shahriari et al.

7. EIE: An efficient inference engine for compressed deep neural networks

EIE: Efficient inference engine on compressed deep neural network (2016)

S. Han et al.

8. Adaptive computation time for recurrent neural networks

Adaptive computation time for recurrent neural networks (2016)

A. Graves

9. Pixel recurrent neural networks

Pixel recurrent neural networks (2016)

A. van den Oord et al. (DeepMind)

10. LSTM: A search space odyssey

LSTM: A search space odyssey (2016)

K. Greff et al.

Note: all papers in this post can be downloaded from: http://pan.baidu.com/s/1pL1UWRp (password: 3KFJ)

