First, the structure of the LSTM cell and the associated calculations were introduced in a previous post (you can click here to view it). This post focuses on the LSTM forward calculation; the earlier post already touched on it in its LSTM section, but here we describe it in more detail alongside the diagrams.
2. LSTM forward calculation, step by step
1. Structure review. Recall the structure of an RNN, shown below; the cell may contain multiple neurons.
The LSTM is an improvement on this cell structure.
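As a concrete point of reference before looking at the LSTM, the recurrent update of the plain RNN cell above can be sketched in NumPy. This is a minimal sketch: the sizes (3-dimensional input, 4-dimensional hidden state) and the weight names `W_h`, `W_x`, `b` are illustrative, not taken from the original post.

```python
import numpy as np

# Hypothetical sizes: 3-dimensional input, 4-dimensional hidden state.
rng = np.random.default_rng(0)
W_h = rng.standard_normal((4, 4)) * 0.1  # hidden-to-hidden weights
W_x = rng.standard_normal((4, 3)) * 0.1  # input-to-hidden weights
b = np.zeros(4)

def rnn_step(h_prev, x_t):
    """One RNN step: h_t = tanh(W_h h_{t-1} + W_x x_t + b)."""
    return np.tanh(W_h @ h_prev + W_x @ x_t + b)

h = np.zeros(4)
for x_t in rng.standard_normal((5, 3)):  # unroll over 5 time steps
    h = rnn_step(h, x_t)
```

The single tanh layer is the whole cell; the LSTM below replaces it with several interacting gate layers.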
Symbol description
The key to the LSTM is the cell state, which acts as the main line along which information is transmitted.
2. Forward calculation, step by step
(1) Decide what information to discard: the forget gate (forget gate layer). σ is the sigmoid activation function; because its range is (0, 1), an output of 0 means forget all of the information and 1 means keep all of it.
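In code, the forget gate is a single sigmoid layer applied to the concatenation of the previous hidden state and the current input. A sketch, with hypothetical sizes and the conventional weight names `W_f`, `b_f`:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(1)
hidden, inputs = 4, 3                                # hypothetical sizes
W_f = rng.standard_normal((hidden, hidden + inputs)) # forget-gate weights
b_f = np.zeros(hidden)

h_prev = rng.standard_normal(hidden)                 # previous hidden state
x_t = rng.standard_normal(inputs)                    # current input

# f_t = sigmoid(W_f . [h_{t-1}, x_t] + b_f); every entry lies in (0, 1)
f_t = sigmoid(W_f @ np.concatenate([h_prev, x_t]) + b_f)
```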
(2) Decide what new information to store. This has two parts: the first is the input gate (input gate layer), which uses the sigmoid function; the second produces the candidate values using the tanh activation function.
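The two parts of this step can be sketched the same way: a sigmoid layer for the input gate and a tanh layer for the candidate values. The weight names `W_i`, `W_c` and the sizes are again illustrative.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(2)
hidden, inputs = 4, 3
concat = rng.standard_normal(hidden + inputs)         # stands in for [h_{t-1}, x_t]

W_i = rng.standard_normal((hidden, hidden + inputs))  # input-gate weights
W_c = rng.standard_normal((hidden, hidden + inputs))  # candidate-value weights
b_i = np.zeros(hidden)
b_c = np.zeros(hidden)

i_t = sigmoid(W_i @ concat + b_i)      # how much of each candidate to admit, in (0, 1)
c_tilde = np.tanh(W_c @ concat + b_c)  # candidate values, in (-1, 1)
```

The input gate plays the same scaling role for new information that the forget gate plays for old information.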
(3) Update the state c_{t-1} to c_t. Since f_t comes from the sigmoid function, its entries lie in (0, 1); multiplying c_{t-1} elementwise by these 0-to-1 values is really a scaling of c_{t-1} (it can be thought of as how much of the previous information to remember), and then the new information is added in: c_t = f_t ⊙ c_{t-1} + i_t ⊙ c̃_t.
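A worked example of this elementwise update, using made-up gate values (exact 0 and 1 are idealized; a real sigmoid only approaches them):

```python
import numpy as np

c_prev = np.array([0.5, -1.0, 2.0, 0.0])   # previous cell state (made up)
f_t = np.array([1.0, 0.0, 0.5, 0.5])       # forget-gate output
i_t = np.array([0.0, 1.0, 0.5, 1.0])       # input-gate output
c_tilde = np.array([0.3, 0.8, -0.2, 0.4])  # candidate values

# c_t = f_t * c_{t-1} + i_t * c~_t  (elementwise)
# f_t = 1 keeps the old value entirely; f_t = 0 forgets it completely.
c_t = f_t * c_prev + i_t * c_tilde
```

In the first component the old value 0.5 survives untouched; in the second it is fully forgotten and replaced by the candidate 0.8.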
(4) Finally, compute the output gate layer: o_t = σ(W_o · [h_{t-1}, x_t] + b_o) decides which parts of the state to output, and the hidden output is h_t = o_t ⊙ tanh(c_t).
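Putting the four steps together, one full LSTM forward step can be sketched as a single function. The stacked weight layout `W[0..3]` and all sizes are my own illustrative choices, not the post's notation.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x_t, h_prev, c_prev, W, b):
    """One LSTM forward step. W stacks the four gate weight matrices
    (forget, input, candidate, output), each applied to [h_{t-1}, x_t]."""
    z = np.concatenate([h_prev, x_t])
    f_t = sigmoid(W[0] @ z + b[0])      # (1) forget gate
    i_t = sigmoid(W[1] @ z + b[1])      # (2) input gate
    c_tilde = np.tanh(W[2] @ z + b[2])  #     candidate values
    c_t = f_t * c_prev + i_t * c_tilde  # (3) state update
    o_t = sigmoid(W[3] @ z + b[3])      # (4) output gate
    h_t = o_t * np.tanh(c_t)
    return h_t, c_t

rng = np.random.default_rng(3)
hidden, inputs = 4, 3
W = rng.standard_normal((4, hidden, hidden + inputs)) * 0.1
b = np.zeros((4, hidden))

h, c = np.zeros(hidden), np.zeros(hidden)
for x_t in rng.standard_normal((5, inputs)):  # run 5 time steps
    h, c = lstm_step(x_t, h, c, W, b)
```

Note that h_t is bounded in (-1, 1) by the tanh and the output gate, while the cell state c_t itself is not bounded — it is the unsquashed main line along which information accumulates.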