The original author sums this up very well.
From NN to RNN to LSTM (2): A Brief Introduction to Recurrent Neural Networks (RNNs) and Their Computation
This post briefly introduces recurrent neural networks (RNNs) and walks through the RNN forward pass and the backpropagation of errors through time.
Please credit the source when reprinting: http://blog.csdn.net/u011414416/article/details/46709965
The following draws mainly on Alex Graves' book Supervised Sequence Labelling with Recurrent Neural Networks.
(http://www.springer.com/cn/book/9783642247965)
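As context for the forward pass the post covers, here is a minimal NumPy sketch of a vanilla RNN's forward computation, h_t = tanh(W_xh x_t + W_hh h_{t-1} + b_h) with output y_t = W_hy h_t + b_y. The weight names, dimensions, and initialization below are illustrative assumptions, not taken from the post:

```python
import numpy as np

def rnn_forward(xs, W_xh, W_hh, W_hy, b_h, b_y):
    """Run a vanilla RNN over a sequence of input vectors xs."""
    h = np.zeros(W_hh.shape[0])          # initial hidden state h_0 = 0
    hs, ys = [], []
    for x in xs:
        h = np.tanh(W_xh @ x + W_hh @ h + b_h)  # recurrent hidden update
        hs.append(h)
        ys.append(W_hy @ h + b_y)               # output at every time step
    return hs, ys

# Toy dimensions: 4 time steps, input dim 3, hidden dim 5, output dim 2.
rng = np.random.default_rng(0)
xs = [rng.standard_normal(3) for _ in range(4)]
W_xh = 0.1 * rng.standard_normal((5, 3))
W_hh = 0.1 * rng.standard_normal((5, 5))
W_hy = 0.1 * rng.standard_normal((2, 5))
hs, ys = rnn_forward(xs, W_xh, W_hh, W_hy, np.zeros(5), np.zeros(2))
```

Because the same W_hh is applied at every step, the backward pass multiplies gradients by it repeatedly, which is exactly what the next post's discussion of vanishing and exploding gradients builds on.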
From NN to RNN to LSTM (3): A Brief Introduction to Long Short-Term Memory (LSTM) and Its Computation
This post briefly describes the vanishing-gradient and exploding-gradient problems of RNNs, then presents the formulas of long short-term memory (LSTM) and their derivations.
Please credit the source when reprinting: http://blog.csdn.net/u011414416/article/details/46724699
The following draws mainly on Alex Graves' book Supervised Sequence Labelling with Recurrent Neural Networks.
(http://www.springer.com/cn/book/9783642247965)
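The LSTM formulas the post refers to can be sketched as a single time step with input, forget, and output gates and a candidate cell state. This is a minimal illustration under the standard formulation, with the stacked-weight layout below being my own assumption rather than the post's notation:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h, c, W, b):
    """One LSTM step. W stacks the four gate weight matrices,
    so z = W @ [x; h] + b has shape (4H,) for hidden size H."""
    z = W @ np.concatenate([x, h]) + b
    H = h.shape[0]
    i = sigmoid(z[0:H])          # input gate
    f = sigmoid(z[H:2*H])        # forget gate
    o = sigmoid(z[2*H:3*H])      # output gate
    g = np.tanh(z[3*H:4*H])      # candidate cell state
    c = f * c + i * g            # additive cell update eases gradient flow
    h = o * np.tanh(c)           # gated hidden output
    return h, c

# Sanity check: zero weights make every gate sigmoid(0) = 0.5 and g = 0,
# so the cell state is simply halved: c_new = 0.5 * c.
H, D = 4, 3
W = np.zeros((4 * H, D + H))
b = np.zeros(4 * H)
h_new, c_new = lstm_step(np.ones(D), np.zeros(H), np.ones(H), W, b)
```

The additive update `c = f * c + i * g` is the key difference from the vanilla RNN: the gradient along the cell state is scaled by the forget gate rather than repeatedly squashed through tanh, which is why LSTM mitigates vanishing gradients.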