CS231n Lecture Notes: Recurrent Neural Networks


Recap of CNN Architectures

Serena is very beautiful, and Justin is the better lecturer. Love him.

Recurrent Neural Networks

RNNs are meant to process sequential data: the hidden state is reused to retain knowledge of the previously fed inputs. They can be used in "one to many", "many to one" and "many to many" scenarios by choosing different input and output strategies. Formally, we maintain a hidden state $h_t$ at the $t$-th step and generate the next hidden state by applying $h_{t+1} = f_W(h_t, x_{t+1})$, where the parameters $W$ are shared across time steps and updated through the gradients flowing back, and the output is read off the hidden state. A vanilla RNN looks like this: $h_t = \tanh(W_{hh} h_{t-1} + W_{xh} x_t)$, $y_t = W_{hy} h_t$. During training we backpropagate the loss through the $y_t$ and update $W$; for long sequences this leads to heavy memory usage (for maintaining the hidden states, as I have experienced), so we chunk the input and update after each chunk (truncated backpropagation through time).
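As a minimal sketch of the recurrence (assuming NumPy, random weights, and illustrative sizes; none of the names below come from the lecture), a vanilla RNN forward pass looks like this:

import numpy as np

# Illustrative sizes and randomly initialized weights.
input_size, hidden_size, output_size = 10, 20, 10
rng = np.random.default_rng(0)
W_xh = rng.standard_normal((hidden_size, input_size)) * 0.01   # input -> hidden
W_hh = rng.standard_normal((hidden_size, hidden_size)) * 0.01  # hidden -> hidden
W_hy = rng.standard_normal((output_size, hidden_size)) * 0.01  # hidden -> output

def rnn_step(h_prev, x):
    # h_t = tanh(W_hh h_{t-1} + W_xh x_t), y_t = W_hy h_t
    h = np.tanh(W_hh @ h_prev + W_xh @ x)
    y = W_hy @ h
    return h, y

# Unroll over a short sequence, reusing the same weights at every step.
h = np.zeros(hidden_size)
for x in rng.standard_normal((5, input_size)):
    h, y = rnn_step(h, x)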

A very insightful and concrete discussion comes from Andrej Karpathy's "The Unreasonable Effectiveness of Recurrent Neural Networks", and he provides practical RNN code on GitHub Gist (a site that is blocked in China :-( ); it consists of only about a hundred lines of Python.

A more optimized version allows for multiple layers, uses an LSTM instead of a vanilla RNN, has more supporting code for model checkpointing, and is of course much more efficient since it uses mini-batches and can run on a GPU.

Image Captioning

Andrej's dissertation covers the interesting topic of combining images and language; check it out!

The structure is to remove the last output layers of a ConvNet (say, the FC-1000 and softmax of a VGG16), feed the resulting feature vector into the hidden state of an RNN through a matrix $W_{ih}$, start with a <START> token, and keep sampling the output $y_t$ and passing the previous output as the next input until we get an <END> token.
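A rough, self-contained sketch of that sampling loop (random weights, made-up sizes and token ids; the names here are mine, not the lecture's), where the image feature vector $v$ from the truncated ConvNet is injected into the recurrence through $W_{ih}$:

import numpy as np

rng = np.random.default_rng(0)
feat_dim, hidden_size, vocab_size = 512, 128, 1000   # illustrative sizes
START_ID, END_ID = 0, 1                              # made-up token ids

W_ih = rng.standard_normal((hidden_size, feat_dim)) * 0.01     # image features -> hidden
W_xh = rng.standard_normal((hidden_size, hidden_size)) * 0.01  # word embedding -> hidden
W_hh = rng.standard_normal((hidden_size, hidden_size)) * 0.01  # hidden -> hidden
W_hy = rng.standard_normal((vocab_size, hidden_size)) * 0.01   # hidden -> vocabulary scores
embed = rng.standard_normal((vocab_size, hidden_size)) * 0.01  # word embedding table

def sample_caption(v, max_len=20):
    # Greedy sampling: the CNN feature vector v conditions every step,
    # and each sampled word is fed back in as the next input until <END>.
    h, word, caption = np.zeros(hidden_size), START_ID, []
    for _ in range(max_len):
        h = np.tanh(W_xh @ embed[word] + W_hh @ h + W_ih @ v)
        word = int(np.argmax(W_hy @ h))   # could also sample from softmax(W_hy @ h)
        if word == END_ID:
            break
        caption.append(word)
    return caption

caption = sample_caption(rng.standard_normal(feat_dim))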

Image Captioning with Attention

This is a slightly more advanced model. First, the CNN produces a grid of features instead of a single summarization vector (as before), and the RNN is allowed to steer its attention over those features (it takes a weighted combination of them as input). Second, the RNN produces two outputs at each time step: one for the attention and one for the word.
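A minimal sketch of one soft-attention step under these assumptions (a grid of CNN features, a hidden state, and a single scoring matrix; the bilinear scoring function is my illustrative choice, not necessarily the one in the lecture):

import numpy as np

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def soft_attention(features, h, W_att):
    # features: (L, D) grid of CNN features, h: (H,) hidden state, W_att: (D, H).
    scores = features @ (W_att @ h)   # one scalar score per spatial location
    weights = softmax(scores)         # attention distribution over the L locations
    context = weights @ features      # (D,) weighted combination of the features
    return context, weights

# Illustrative usage with random numbers (e.g. a 7x7 grid of 512-d features).
rng = np.random.default_rng(0)
L, D, H = 49, 512, 128
context, weights = soft_attention(rng.standard_normal((L, D)),
                                  rng.standard_normal(H),
                                  rng.standard_normal((D, H)) * 0.01)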

Visual Question Answering

RNNs allow us to solve some very fancy problems; it is cool!

Multilayer RNN, LSTM and GRU

Deeper models usually work better, so we stack layers spatially to get a multilayer RNN.
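A minimal sketch of the stacking (random weights, illustrative depth; for simplicity every layer here has the same width): at each time step, layer $l$'s new hidden state is fed upward as the input of layer $l+1$.

import numpy as np

rng = np.random.default_rng(0)
depth, hidden_size = 3, 20   # illustrative sizes
# One (W_xh, W_hh) pair per layer; layer l's input is layer l-1's hidden state.
layers = [(rng.standard_normal((hidden_size, hidden_size)) * 0.01,
           rng.standard_normal((hidden_size, hidden_size)) * 0.01)
          for _ in range(depth)]

def multilayer_step(hs, x):
    # One time step of a stacked vanilla RNN.
    new_hs, inp = [], x
    for (W_xh, W_hh), h_prev in zip(layers, hs):
        h = np.tanh(W_xh @ inp + W_hh @ h_prev)
        new_hs.append(h)
        inp = h            # feed this layer's state up to the next layer
    return new_hs

hs = [np.zeros(hidden_size) for _ in range(depth)]
for x in rng.standard_normal((5, hidden_size)):
    hs = multilayer_step(hs, x)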

Vanilla RNN gradient flow: backpropagation multiplies by $W_{hh}$ over and over, which leads to exploding gradients (solved by gradient clipping: if the L2 norm of the gradient is too large, e.g. bigger than some threshold, rescale it down to that threshold) or vanishing gradients (solved by fancier architectures).
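A small sketch of the clipping rule (the threshold value is illustrative):

import numpy as np

def clip_gradient(grad, threshold=5.0):
    # If the L2 norm of the gradient exceeds the threshold,
    # rescale it so that its norm equals the threshold.
    norm = np.linalg.norm(grad)
    if norm > threshold:
        grad = grad * (threshold / norm)
    return grad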

LSTM [Hochreiter and Schmidhuber, 1997]

The ⊙ operation in the lecture means element-wise multiplication. Note the different non-linearities applied to each gate and the size of the stacked weight matrix, as written out below.
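For reference, the block form of the LSTM update as it appears on the lecture slides ($\sigma$ is the sigmoid and $\odot$ the element-wise product; for hidden size $h$, the stacked $W$ is a $4h \times 2h$ matrix):

$$
\begin{pmatrix} i \\ f \\ o \\ g \end{pmatrix} =
\begin{pmatrix} \sigma \\ \sigma \\ \sigma \\ \tanh \end{pmatrix}
W \begin{pmatrix} h_{t-1} \\ x_t \end{pmatrix}, \qquad
c_t = f \odot c_{t-1} + i \odot g, \qquad
h_t = o \odot \tanh(c_t)
$$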

Gradient flow: uninterrupted, because the gradient flows backward through the cell state $c$ via element-wise multiplications by the forget gate, without repeated multiplication by $W$.

Metaphysical structure (a figure in the lecture slides):

GRU [Learning Phrase Representations using RNN Encoder-Decoder for Statistical Machine Translation, Cho et al., 2014]

[LSTM: A Search Space Odyssey, Greff et al., 2015]

[An Empirical Exploration of Recurrent Network Architectures, Jozefowicz et al., 2015]

Key idea: manage gradient flow.
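For reference, one common form of the GRU update (gate conventions differ slightly between papers; $\sigma$ is the sigmoid, $\odot$ the element-wise product, $r_t$ the reset gate and $z_t$ the update gate):

$$
\begin{aligned}
r_t &= \sigma(W_{xr} x_t + W_{hr} h_{t-1}) \\
z_t &= \sigma(W_{xz} x_t + W_{hz} h_{t-1}) \\
\tilde{h}_t &= \tanh\big(W_{xh} x_t + W_{hh} (r_t \odot h_{t-1})\big) \\
h_t &= (1 - z_t) \odot h_{t-1} + z_t \odot \tilde{h}_t
\end{aligned}
$$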
