"Attention Is All You Need" (Introduction + code)

Source: Internet
Author: User


In 2017, two papers appeared that I particularly admired: Facebook's "Convolutional Sequence to Sequence Learning" and Google's "Attention Is All You Need". Both are innovations on seq2seq; in essence, both abandon the RNN structure for seq2seq tasks.

In this blog post, the author gives a simple analysis of "Attention Is All You Need". Of course, both papers are quite popular, so there are many interpretations online (though many of them are direct translations of the papers, with little independent understanding), so here I try as much as possible to use my own words and avoid repeating what others have already said.

I. Sequence encoding

The basic deep-learning approach to NLP is to first segment the sentence into words, then map each word to its word vector, giving a sequence of vectors. Each sentence then corresponds to a matrix X = (x_1, x_2, ..., x_n), where x_i is the word vector (a row vector) of the i-th word and has dimension d, so X ∈ R^{n×d}. The problem then becomes how to encode these sequences.
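As a concrete illustration, the mapping from a tokenized sentence to the matrix X can be sketched as follows. The toy vocabulary, the dimension d, and the random embedding table are assumptions for illustration; in practice the embeddings come from a trained model.

```python
import numpy as np

# Toy vocabulary and embedding table (illustrative assumptions).
vocab = {"the": 0, "cat": 1, "sat": 2}
d = 4                                          # embedding dimension (assumed)
rng = np.random.default_rng(0)
embedding = rng.normal(size=(len(vocab), d))   # one row vector per word

def encode(tokens):
    """Map tokens to the matrix X = (x_1, ..., x_n), with X in R^{n x d}."""
    return np.stack([embedding[vocab[t]] for t in tokens])

X = encode(["the", "cat", "sat"])
print(X.shape)  # (3, 4): n = 3 words, each a d = 4 dimensional row vector
```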

The first basic idea is the RNN layer. The RNN scheme is very simple: a recursion of the form

y_t = f(y_{t-1}, x_t)
Whether it is the LSTM, the GRU, or the recently popular SRU, none of them departs from this recursive framework. The RNN structure itself is relatively simple and well suited to sequence modeling, but one obvious drawback of the RNN is that it cannot be parallelized, so it is slow; this is a natural defect of recursion. In addition, I personally feel that RNNs cannot learn global structural information very well, because an RNN is essentially a Markov decision process.
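The recursion above, and why it resists parallelization, can be sketched with a plain tanh cell. The shapes and weights here are illustrative assumptions, not anything from either paper:

```python
import numpy as np

# Minimal RNN recursion y_t = f(y_{t-1}, x_t) with a tanh cell (assumed f).
n, d, h = 5, 4, 8                       # sequence length, input dim, hidden dim
rng = np.random.default_rng(0)
X = rng.normal(size=(n, d))             # input sequence x_1 ... x_n
W_x = rng.normal(size=(d, h))
W_y = rng.normal(size=(h, h))

y = np.zeros(h)
outputs = []
for t in range(n):                      # inherently sequential loop:
    y = np.tanh(X[t] @ W_x + y @ W_y)   # y_t depends on y_{t-1}
    outputs.append(y)

Y = np.stack(outputs)
print(Y.shape)  # (5, 8)
```

Because each y_t needs y_{t-1}, the loop cannot be unrolled across time steps on parallel hardware, which is exactly the speed defect mentioned above.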

The second idea is the CNN layer. In fact, the CNN scheme is also very natural: a window-style traversal. For example, a convolution of size 3 is

y_t = f(x_{t-1}, x_t, x_{t+1})
