[Moses notes] compiling Moses decoder with NPLM

Source: Internet
Author: User

The ACL 2014 best paper, "Fast and Robust Neural Network Joint Models for Statistical Machine Translation", applies deep learning to SMT, presenting a Neural Network Joint Model (in essence, a language model that also conditions on source-side context). The experiments reported in the paper claim substantial improvements over previous translation systems based on n-gram language models.


The open-source machine translation system Moses has implemented and integrated it as well. See [1].

An implementation of Devlin et al.'s neural network language model, which uses target-side history as well as source-side context, is available in Moses as BilingualLM. It uses NPLM as its back-end (check NPLM's installation instructions).

Moses itself calls it the bilingual neural LM (bilingual neural network language model).


As with other language models, Moses integrates it as a feature into its decoder. The feature integration is described in [1]. But if you are impatient and simply add the feature to the Moses configuration file moses.ini, then train and decode with an off-the-shelf Moses decoder (typically built with KenLM or SRILM as the language model), the decoder will fail with an unknown-feature error.
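For concreteness, a bilingual LM feature entry in moses.ini could look like the sketch below. The feature name BilingualNPLM and its parameters follow the Moses documentation [1]; all paths and values here are hypothetical placeholders, not a working configuration:

```ini
# moses.ini fragment (hypothetical paths and values)
[feature]
BilingualNPLM order=5 source_window=4 path=/path/to/nnjm.model source_vocab=/path/to/vocab.source target_vocab=/path/to/vocab.target

[weight]
BilingualNPLM0= 0.1
```

A stock Moses binary compiled without NPLM support will reject such an entry as an unknown feature, which is exactly the error just described.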


According to [1], the correct procedure is:

1. Install NPLM, a neural network language model toolkit.

2. Pass a parameter when compiling Moses:

    • --with-nplm=<root dir of the NPLM toolkit>
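Putting the two steps together, the build might be sketched as follows. The checkout locations, the NPLM repository URL, and the -j4 flag are assumptions; consult NPLM's own installation instructions for its prerequisites (Boost, Eigen):

```shell
# 1. Build NPLM (assumed checkout location and repository)
cd ~/src
git clone https://github.com/moses-smt/nplm.git
cd nplm/src
make                        # edit the Makefile first to point at Boost/Eigen

# 2. Compile Moses, passing the NPLM root directory
cd ~/src/mosesdecoder
./bjam --with-nplm=$HOME/src/nplm -j4
```

After a successful build, the decoder can load the bilingual LM feature configured in moses.ini.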


But I ran into a problem during installation:

Unable to load Boost.Build: could not find "boost-build.jam"
Attempted search from /path/to/mosesdecoder up to the root at /path/to/mosesdecoder/share/boost-build, and in these directories from BOOST_BUILD_PATH and BOOST_ROOT: /usr/share/boost-build.


At first I assumed the Boost library had been installed incorrectly. Following [2], [3], and the like, I fiddled with Boost for quite a while, without any effect...



Reference:

[1] http://www.statmt.org/moses/?n=FactoredTraining.BuildingLanguageModel#ntoc38

[2] http://comments.gmane.org/gmane.comp.nlp.moses.user/6322

[3] http://comments.gmane.org/gmane.comp.nlp.moses.user/11461

Copyright notice: This is the blogger's original article; please do not reproduce without permission.
