Recurrent Neural Network Language Modeling Toolkit by Tomas Mikolov: A Usage Example


The recurrent neural network language model toolkit is available at: http://www.fit.vutbr.cz/~imikolov/rnnlm/


1. Basic use of the toolkit

The version used here is rnnlm-0.3e.

Step 1. Extract the archive. After extraction, the files are:


Figure 1. Files extracted from rnnlm-0.3e
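For completeness, a sketch of the extraction command; the archive name rnnlm-0.3e.tgz is my assumption and may differ from the file actually downloaded:

tar -xzf rnnlm-0.3e.tgz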


Step 2. Compile the toolkit

Command:

make clean

make

You may see an error saying that the x86_64-linux-g++-4.6 command cannot be found.

If this error occurs, simply change the first line of the makefile from CC = x86_64-linux-g++-4.6 to CC = g++.
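Equivalently, the edit can be made with a one-line sed command (assuming the file is named makefile and its first line is the CC assignment, as described above):

sed -i 's/^CC = .*/CC = g++/' makefile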

After compiling, the following new files are generated


Figure 2. After compilation, the rnnlm executable is generated


Step 3. Run rnnlm

The example.sh file already contains the command to run; here I simply copy it and execute it to see the effect.


Figure 3. The rnnlm training command to execute
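Since the actual command is only visible in the figure, the following is a rough sketch of the kind of training command example.sh contains; the flag values (hidden layer size, number of classes, BPTT steps, random seed) are illustrative and may not match the script exactly:

./rnnlm -train train -valid valid -rnnlm model -hidden 15 -class 100 -bptt 4 -rand-seed 1 -debug 2
# -train / -valid   training and validation text files
# -rnnlm model      file to write the trained model to
# -hidden 15        size of the recurrent hidden layer
# -class 100        number of word classes used to factorize the output layer
# -bptt 4           number of steps for backpropagation through time
# -rand-seed 1      seed for weight initialization
# -debug 2          verbosity of the progress output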


Output after executing the command:


Figure 4. Output after execution of RNNLM

The output contains a number of statistics; I am not yet sure what each of them means, so I will fill that in later.


At this point, the RNNLM has been trained and the model files have been generated, for example:


Figure 5. RNNLM model files

The file model holds the parameters of the trained RNNLM, and model.output.txt records the model's performance on the validation set (the valid data).
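If the model was saved in text form (i.e. the training command did not use the -binary flag), its header is human-readable, so an optional sanity check is simply to look at the first lines, which should list the training files and network sizes:

head -n 20 model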


** The train, valid, and test data included in the toolkit contain 10000, 999, and 1000 English sentences, respectively; the corpus is very small.


Step 4. Evaluate the trained model on the test set

Command: ./rnnlm -rnnlm model -test test


Figure 6. Performance of the trained model on the test set
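As a reading aid (not output copied from this run): rnnlm reports a total test log probability together with a perplexity; assuming the log probability is base-10 and is summed over all N test words, the two quantities are related by PPL = 10^(-log10 P / N).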


The following compares the RNNLM with a standard N-gram model.

Step 5. Train a standard N-gram model [the SRILM toolkit must be installed separately]

Command: ngram-count -text train -order 5 -lm templm -kndiscount -interpolate -gt3min 1 -gt4min 1


Figure 7. The trained 5-gram model, named templm
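For readers unfamiliar with SRILM, the flags used in Step 5 mean roughly the following (per the standard SRILM documentation):

# -text train           training text
# -order 5              build a 5-gram model
# -lm templm            write the trained language model to the file templm
# -kndiscount           use modified Kneser-Ney discounting
# -interpolate          interpolate higher- and lower-order estimates
# -gt3min 1 -gt4min 1   include 3-grams and 4-grams that occur at least once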


Step 6. Evaluate the test set with the trained templm

Command: ngram -lm templm -order 5 -ppl test -debug 2 > temp.ppl
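Again roughly, per the standard SRILM documentation:

# -lm templm   read the language model trained in Step 5
# -order 5     evaluate it as a 5-gram model
# -ppl test    compute perplexity over the file test
# -debug 2     also print a probability line for every word (this is what convert needs later)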

After the test finishes, a temp.ppl file is generated in the current directory; it records the performance of templm on the test set.


Figure 8. The last two lines of the temp.ppl file
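For orientation only (the numbers below are placeholders, not results from this run): the closing summary of SRILM's -ppl output normally looks something like the two lines below; ppl counts end-of-sentence tokens as predicted events, while ppl1 excludes them.

file test: 1000 sentences, 10000 words, 0 OOVs
0 zeroprobs, logprob= -12345.6 ppl= 123.4 ppl1= 145.6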


** The example also includes commands for the convert tool:

gcc convert.c -O2 -o convert
./convert < temp.ppl > srilm.txt

The srilm.txt file produced in this step is required by Step 7 below.


** The convert.c source file contains a short description of its function: "This simple program converts SRILM output obtained in -debug 2 test mode to raw per-word probabilities."
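For orientation, a per-word line in the -debug 2 output has roughly the shape shown below (the word, context, and numbers are made-up placeholders, and the exact spacing and bracketed annotations can vary); convert strips everything except the raw probability values:

p( the | <s> ... )  = [2gram] 0.123456 [ -0.908 ]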

[On the toolkit's sample corpus, comparing the sizes of the RNNLM and N-gram models against their performance on the test set should allow some conclusions to be drawn.]


Step 7. Weighted combination of RNNLM and N-gram

Command: ./rnnlm -rnnlm model -test test -lm-prob srilm.txt -lambda 0.5
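As I understand the toolkit, -lm-prob supplies the per-word N-gram probabilities from srilm.txt and -lambda is the weight given to the RNNLM, so for each word w the combined probability is

P(w) = lambda * P_RNN(w) + (1 - lambda) * P_ngram(w)

and -lambda 0.5 weights the two models equally.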

The performance of the interpolated model on the test set is as follows:



This concludes the basic usage of RNNLM.


# Performance on my own corpus:

Corpus statistics:

Corpus                 Train       Valid      Test
Number of sentences    2500000     250000     184985
Number of words        70302518    9113721    6993456

The program is still running ...
