Natural Language Processing Project Outline


The project is in two parts: the first is lossless text compression; the second is sentence-level text summarization, which can be seen as lossy text compression.

Don't set expectations too high for the second part; there is a good chance it won't be finished, since I have had no prior exposure to that field.

Lossless Text Compression

Overall introduction. The internet produces too much text (or is that a pseudo-proposition?); without compression, storage and transmission are uneconomical. Just installing the NLTK corpora meant downloading nearly 300 MB of text, which took quite a while. (Add more examples, e.g. web text, to motivate text compression.)

Information theory. Lay out the mathematical foundations, mainly coding and information entropy. Introduce Huffman coding, and use it to compute compression ratios on the Brown corpus at both the character and word level. Then describe arithmetic coding and demonstrate its advantage over Huffman coding.
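A minimal sketch of the character-level experiment: build Huffman code lengths from symbol frequencies and compare the average code length against the entropy bound. A short sample string stands in for the Brown corpus so the sketch runs without the NLTK download; `huffman_code_lengths` is a hypothetical helper name, not an NLTK function.

```python
import heapq
from collections import Counter
from math import log2

def huffman_code_lengths(freqs):
    """Build a Huffman tree and return the code length of each symbol."""
    # Heap items: (weight, tiebreak, {symbol: depth-so-far})
    heap = [(w, i, {s: 0}) for i, (s, w) in enumerate(freqs.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        w1, _, a = heapq.heappop(heap)
        w2, _, b = heapq.heappop(heap)
        # Merging two subtrees pushes every leaf in them one level deeper
        merged = {s: d + 1 for s, d in {**a, **b}.items()}
        heapq.heappush(heap, (w1 + w2, count, merged))
        count += 1
    return heap[0][2]

text = "this is an example of a huffman tree"
freqs = Counter(text)
lengths = huffman_code_lengths(freqs)
total = sum(freqs.values())

# Average bits per symbol under the Huffman code vs. the entropy bound
avg_bits = sum(freqs[s] * lengths[s] for s in freqs) / total
entropy = -sum((f / total) * log2(f / total) for f in freqs.values())
ratio = avg_bits / 8  # vs. a fixed-length 8-bit encoding
print(f"entropy={entropy:.3f} bits, huffman={avg_bits:.3f} bits, ratio={ratio:.2%}")
```

For the real experiment, the frequencies would come from `nltk.corpus.brown` (over characters or over words) instead of the sample string. The gap between `avg_bits` and `entropy` is exactly what arithmetic coding closes.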

A brief overview of data compression. Explain that data compression = coding + modeling. Coding is essentially a solved problem; modeling is more of an AI problem. Refer to the book of the Don.
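The model/coder split can be illustrated without a real coder: an ideal arithmetic coder spends exactly -log2(p) bits on a symbol the model predicts with probability p, so the model alone determines the compressed size. A sketch under that assumption, with a hypothetical adaptive order-0 model (`Order0Model` is my name, not a library class):

```python
from collections import Counter
from math import log2

class Order0Model:
    """Adaptive order-0 model: predicts each byte from counts seen so far."""
    def __init__(self):
        self.counts = Counter()
        self.total = 0

    def prob(self, symbol):
        # Laplace smoothing over a 256-byte alphabet, so unseen
        # symbols still get nonzero probability
        return (self.counts[symbol] + 1) / (self.total + 256)

    def update(self, symbol):
        self.counts[symbol] += 1
        self.total += 1

def ideal_code_length(data, model):
    """Bits an ideal arithmetic coder would emit given the model's predictions."""
    bits = 0.0
    for symbol in data:
        bits += -log2(model.prob(symbol))  # coder cost is fixed by the model
        model.update(symbol)               # decoder can make the same update
    return bits

data = b"abracadabra " * 10
bits = ideal_code_length(data, Order0Model())
print(f"{bits / 8:.1f} bytes vs {len(data)} raw")
```

Improving compression then reduces to swapping in a better model (PPM, LSTM, ...) while the coding side stays unchanged.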

PPM. Use Python to implement PPM and compress the Brown corpus (or the entire NLTK download, to compare against that 300 MB). Use NLTK functions where possible: n-grams for the contexts and FreqDist for the statistics. (The time complexity is likely to explode.)
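A rough sketch of the PPM idea, again measured in ideal-coder bits: try the longest context first and pay an escape cost to fall back to shorter contexts when the symbol is novel. Plain `Counter` stands in for NLTK's `FreqDist` (they behave alike for counting) so the sketch runs without the corpus download; the escape estimator is a simplified one and the exclusion principle is ignored.

```python
from collections import Counter, defaultdict
from math import log2

def ppm_bits(text, order=2):
    """Estimate the bits an ideal arithmetic coder would emit with a
    simplified PPM model: longest context first, escape on novel symbols."""
    # One context table per order: contexts[k] maps a k-char context
    # to the counts of symbols that followed it
    contexts = [defaultdict(Counter) for _ in range(order + 1)]
    bits = 0.0
    for i, sym in enumerate(text):
        coded = False
        for k in range(min(order, i), -1, -1):
            counts = contexts[k][text[i - k:i]]
            total = sum(counts.values())
            if counts[sym]:
                # Seen in this context: probability c/(total+1),
                # with 1/(total+1) reserved for the escape symbol
                bits += -log2(counts[sym] / (total + 1))
                coded = True
                break
            if total:
                bits += -log2(1 / (total + 1))  # pay for the escape
        if not coded:
            bits += 8.0  # uniform fallback over a 256-symbol alphabet
        # Update every order's statistics with the symbol just coded
        for k in range(min(order, i) + 1):
            contexts[k][text[i - k:i]][sym] += 1
    return bits

text = "the quick brown fox jumps over the lazy dog " * 20
bits = ppm_bits(text)
print(f"{bits / 8:.0f} bytes vs {len(text)} raw")
```

For the real run, `text` would be the Brown corpus (e.g. via `nltk.corpus.brown`), which is where the complexity concern above comes from: the context tables grow with both the order and the corpus size.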

Introduce text prediction with LSTMs. Survey the existing material, including the existing Stanford project, and look at how cmix and PAQ8 do it; predecessors should already have covered this direction. Run benchmarks, e.g. compressing enwik8. Reaching state of the art would be more than I could hope for.

Several LSTM implementations:

https://github.com/kedartatwawadi/NN_compression (this appears to be the Stanford project mentioned above)

https://github.com/byronknoll/lstm-compress (the cmix implementation; surprisingly, the backpropagation is hand-written)

Lossy Text Compression

This part follows sentence-level text summarization.

Two approaches. On the one hand, the traditional method: attempt to analyze sentence structure, using already-annotated corpora to simplify the sentence structure.

On the other hand, neural networks with an attention model. Reference: "When Deep Learning Meets Automatic Text Summarization".
