Application examples of recurrent neural networks

Application example of RNN: an RNN-based language model

Now let's introduce an RNN-based language model. We feed words into the recurrent neural network one at a time; after each input word, the network outputs the most likely next word given everything it has seen so far. For example, when we enter the following words in turn:

I was late for school yesterday.

The output of the neural network is shown in the following figure:

Here, s and e are two special tokens that mark the beginning and the end of a sequence, respectively.

Vectorization

We know that the inputs and outputs of a neural network are vectors. For the language model to be processed by a neural network, we must first represent each word as a vector.

The input to the neural network is a word, and we can vectorize it with the following steps: build a dictionary containing all the words, and give each word in the dictionary a unique number. Any word can then be represented by an N-dimensional one-hot vector, where N is the number of words in the dictionary. Suppose a word's number in the dictionary is i, and v is the vector representing that word; then its j-th element v_j is:

$$v_j = \begin{cases} 1, & j = i \\ 0, & j \neq i \end{cases}$$

The meaning of the above formula can be visually represented by the following diagram:

With this vectorization method, we get a high-dimensional, sparse vector (sparse meaning that most of its elements are 0). Processing such vectors gives the neural network a very large number of parameters and therefore a huge amount of computation. In practice, dimensionality-reduction methods are often used to transform the high-dimensional sparse vectors into low-dimensional dense vectors, but that topic is beyond the scope of this article.
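To make the dictionary-and-one-hot mapping described above concrete, here is a minimal Python sketch; the toy vocabulary and the helper names (`build_vocab`, `one_hot`) are illustrative assumptions, not taken from the article.

```python
import numpy as np

def build_vocab(words):
    """Assign a unique number to every distinct word (a toy 'dictionary')."""
    return {word: i for i, word in enumerate(sorted(set(words)))}

def one_hot(word, vocab):
    """Return the N-dimensional one-hot vector for `word`: v[j] = 1 only when j == vocab[word]."""
    v = np.zeros(len(vocab))
    v[vocab[word]] = 1.0
    return v

# Toy example sentence, split into words.
words = "I was late for school yesterday".split()
vocab = build_vocab(words)

x = one_hot("school", vocab)
print(vocab)  # {'I': 0, 'for': 1, 'late': 2, 'school': 3, 'was': 4, 'yesterday': 5}
print(x)      # sparse vector: all zeros except a single 1
```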

The output required by the language model is the most likely next word. We can let the recurrent neural network compute, for every word in the dictionary, the probability that it is the next word, and then take the word with the highest probability as the prediction. The output of the neural network is therefore also an N-dimensional vector, in which each element is the probability that the corresponding dictionary word is the next word, as shown in the following illustration.
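As a rough illustration of how such an N-dimensional output vector could be produced, the sketch below implements one step of a simple (Elman-style) recurrent layer: the hidden state is updated from the current one-hot input and the previous hidden state, and a linear output layer emits one raw score per dictionary word. The weight names U, W, V, the tanh nonlinearity, and the hidden size are assumptions made for this sketch, not details given in the article.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 6            # dictionary size (one score per word)
hidden_size = 8  # assumed hidden-state size

# Randomly initialised weights, standing in for a trained network.
U = rng.normal(scale=0.1, size=(hidden_size, N))            # input -> hidden
W = rng.normal(scale=0.1, size=(hidden_size, hidden_size))  # hidden -> hidden (recurrence)
V = rng.normal(scale=0.1, size=(N, hidden_size))            # hidden -> output scores

def rnn_step(x, s_prev):
    """One time step: update the hidden state from the one-hot input x and the
    previous hidden state, then emit one raw score per dictionary word."""
    s = np.tanh(U @ x + W @ s_prev)
    scores = V @ s          # N-dimensional vector, one entry per dictionary word
    return scores, s

s = np.zeros(hidden_size)        # initial hidden state
x = np.zeros(N); x[3] = 1.0      # one-hot vector for some input word
scores, s = rnn_step(x, s)
print(scores.shape)              # (6,) -- one raw score per dictionary word
```

These raw scores are not yet probabilities; turning them into a proper probability distribution is exactly what the softmax layer described next is for.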

Softmax Layer

As mentioned earlier, a language model models the probability of the next word appearing. So how do we make the neural network output probabilities? The method is to use a softmax layer as the output layer of the neural network.

Let's first look at the definition of the Softmax function:

$$g(z_i) = \frac{e^{z_i}}{\sum_{k} e^{z_k}}$$
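The function is easy to implement directly. The sketch below turns a vector of raw output scores into a probability vector that sums to 1, from which the most probable next word can be read off with argmax; the subtraction of the maximum is a standard numerical-stability trick (it does not change the result), and the toy score values are made up for illustration.

```python
import numpy as np

def softmax(z):
    """g(z_i) = exp(z_i) / sum_k exp(z_k), computed in a numerically stable way."""
    e = np.exp(z - np.max(z))   # shifting by the max leaves the ratios unchanged
    return e / e.sum()

# Raw scores from the output layer, one per dictionary word (toy values).
scores = np.array([1.2, -0.3, 0.5, 2.0, 0.0, -1.0])
probs = softmax(scores)

print(probs)                  # every entry lies in (0, 1)
print(probs.sum())            # 1.0 -- a valid probability distribution
print(int(np.argmax(probs)))  # index of the most probable next word in the dictionary
```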
