Reference material found on the Internet:
http://www.shareditor.com/bloglistbytag/?tagname=%E8%87%AA%E5%B7%B1%E5%8A%A8%E6%89%8B%E5%81%9A%E8%81%8A%E5%A4%A9%E6%9C%BA%E5%99%A8%E4%BA%BA
Happily, this material is very comprehensive. Hats off to science.
http://www.shareditor.com/blogshow?blogId=136 Do-It-Yourself Chatbot 42: (heavyweight) developing your own chatbot from theory to practice
These are the best-known models at present; they are worth reading when you have time.
What are recurrent neural networks and LSTM?
For this you can refer to my own Do-It-Yourself Chatbot 26: Illustrated Recurrent Neural Networks (RNN), or go straight to Christopher Olah's post http://colah.github.io/posts/2015-08-Understanding-LSTMs/, which has been cited countless times in the industry; a true classic.
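Before moving on, here is a minimal sketch of what running an LSTM looks like in code, assuming TensorFlow 1.x (the tf.contrib era this series uses); the batch size, sequence length, and dimensions are illustrative assumptions, not values from the posts above:

# Minimal sketch: run an LSTM over a batch of sequences (TensorFlow 1.x).
import tensorflow as tf

batch_size, max_steps, input_dim, hidden_dim = 32, 10, 128, 256

# [batch, time, features] inputs, e.g. the embedded words of a sentence.
inputs = tf.placeholder(tf.float32, [batch_size, max_steps, input_dim])

cell = tf.contrib.rnn.BasicLSTMCell(hidden_dim)

# dynamic_rnn unrolls the cell over the time axis; outputs holds the
# hidden state at every step, state holds the final (c, h) pair.
outputs, state = tf.nn.dynamic_rnn(cell, inputs, dtype=tf.float32)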
What is the seq2seq model?
seq2seq is a sequence-to-sequence model built on recurrent neural networks. Language translation and automatic question answering are both sequence-to-sequence scenarios, so both can use a seq2seq model; for the principle of building a chatbot with seq2seq, see this article: http://suriyadeepan.github.io/2016-06-28-easy-seq2seq/. TensorFlow has already implemented a good API for us to use, but it takes many parameters and the underlying principle is complex, so it is rather obscure to understand; in this article let me take you step by step to explore and use it.
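To make the encoder/decoder structure concrete, here is a minimal sketch using TensorFlow 1.x's simplest legacy interface, basic_rnn_seq2seq (not yet the attention version we will use later); all shapes and sizes are illustrative assumptions:

# Minimal sketch of the seq2seq idea: an encoder RNN reads the input
# sequence, and a decoder RNN, initialized with the encoder's final
# state, emits the answer sequence (TensorFlow 1.x legacy API).
import tensorflow as tf

batch_size, input_size, hidden_dim = 32, 128, 256
encoder_steps, decoder_steps = 10, 12

# legacy_seq2seq expects a Python list of per-time-step tensors,
# each of shape [batch_size, input_size].
encoder_inputs = [tf.placeholder(tf.float32, [batch_size, input_size])
                  for _ in range(encoder_steps)]
decoder_inputs = [tf.placeholder(tf.float32, [batch_size, input_size])
                  for _ in range(decoder_steps)]

cell = tf.contrib.rnn.BasicLSTMCell(hidden_dim)

# outputs is a list with one decoder output per decoder time step.
outputs, state = tf.contrib.legacy_seq2seq.basic_rnn_seq2seq(
    encoder_inputs, decoder_inputs, cell)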
What is the attention model?
The attention model (attentional model) solves the problem that the seq2seq decoder accepts only the encoder's last output and discards the earlier ones, whereas in principle an answer is generated from the information at certain key positions in the question, that is, wherever attention is concentrated. For details, see http://www.wildml.com/2016/01/attention-and-memory-in-deep-learning-and-nlp/
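The core computation is small enough to show in a toy numpy sketch (a plain dot-product scoring, an illustrative simplification rather than TensorFlow's exact attention mechanism): the decoder scores every encoder output against its current state and takes a weighted sum, so information from any position in the question can be attended to, not just the last state.

# Toy numpy sketch of dot-product attention (illustrative only).
import numpy as np

encoder_outputs = np.random.randn(10, 256)  # one vector per input word
decoder_state = np.random.randn(256)        # current decoder hidden state

scores = encoder_outputs @ decoder_state    # one score per input position
e = np.exp(scores - scores.max())           # softmax (stabilized)
weights = e / e.sum()                       # attention weights, sum to 1
context = weights @ encoder_outputs         # weighted sum = context vector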
Making your own chatbot with TensorFlow's seq2seq
At this point, assuming you have mastered the theory above, we can lock onto the powerful API that TensorFlow provides. To really make good use of TensorFlow you must understand its important interfaces and all of their parameters, so the first step is to find the most critical interface we will use this time: https://www.tensorflow.org/api_docs/python/tf/contrib/legacy_seq2seq/embedding_attention_seq2seq
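As a first look at this interface, here is a hedged usage sketch; the vocabulary size, sequence lengths, and dimensions are illustrative assumptions, and later steps will fill in real data and training:

# Sketch: calling embedding_attention_seq2seq (TensorFlow 1.x).
import tensorflow as tf

batch_size = 32
encoder_steps, decoder_steps = 10, 12
vocab_size, embedding_size, hidden_dim = 10000, 128, 256

# Token ids, one 1-D [batch_size] int32 tensor per time step.
encoder_inputs = [tf.placeholder(tf.int32, [batch_size])
                  for _ in range(encoder_steps)]
decoder_inputs = [tf.placeholder(tf.int32, [batch_size])
                  for _ in range(decoder_steps)]

cell = tf.contrib.rnn.BasicLSTMCell(hidden_dim)

# Embeds the ids, runs an attention-augmented encoder/decoder, and
# returns one [batch_size, num_decoder_symbols] logit tensor per step.
outputs, state = tf.contrib.legacy_seq2seq.embedding_attention_seq2seq(
    encoder_inputs,
    decoder_inputs,
    cell,
    num_encoder_symbols=vocab_size,
    num_decoder_symbols=vocab_size,
    embedding_size=embedding_size,
    feed_previous=False)  # set True at inference to feed outputs back in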