RNNLM Toolkit is a language modeling tool based on recurrent neural networks; the original version is available at http://rnnlm.org. A Windows port is now provided, containing both a MinGW version and a VS2010 version.
Download links:
Baidu network disk: http://pan.baidu.com/s/1dDiqYTr
Weibo Vdisk: http://vdisk.weibo.com/s/urVxq0j_jHgCu
Simple tests show that the MinGW version is considerably faster, so compiling with MinGW is recommended. For MinGW installation and configuration, see another blog post: http://www.cnblogs.com/toofooltosaysmth/p/3890300.html.
Compiling the MinGW version: after MinGW is configured, open a terminal, change into the MinGW directory, and run "make". For a development environment, NetBeans or Eclipse can be chosen according to personal preference.
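As a rough sketch (the directory name below is a placeholder for the MinGW folder of the download, and it assumes MinGW's bin directory is already on the PATH):

    cd rnnlm-mingw
    make

If the build succeeds, an rnnlm executable should appear in the same directory.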
VS2010 version: install Visual Studio 2010, then double-click the project file to open it.
The example directory contains sample scripts for testing, along with precompiled executables for both versions.
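As a rough illustration of what such a script typically does (the file names below are placeholders, not necessarily the files shipped in the example directory; the exact options are documented in the toolkit itself), training and then evaluating a model looks roughly like:

    rnnlm -train train.txt -valid valid.txt -rnnlm model.rnn -hidden 40 -class 100 -bptt 3
    rnnlm -rnnlm model.rnn -test test.txt

The first command trains a model with a 40-unit hidden layer, 100 output classes, and BPTT unrolled a few steps back; the second reports the log probability and perplexity of the trained model on the test set.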
A follow-up post will describe the algorithms in RNNLM and some techniques for setting the training parameters; there is also a plan to use SWIG to build a Python wrapper around the current C++ code for the convenience of Python users.
The blog should be updated within the next 1-2 weeks.
Forget words
[Email protected]
http://weibo.com/u/3226980174/