A fast method for recognizing unregistered words (principles and implementation)
There are already many Chinese word segmentation algorithms available online, each written from a different understanding of segmentation. However, there still seems to be no recognition algorithm for unregistered words. In view of this, I wrote one, in the hope of inspiring others.
Algorithm assumptions:
1. An unregistered word is composed of single characters (after segmentation it appears as a run of single-character tokens);
2. If a character belongs to two unregistered words at the same time, the first recognized word wins.
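Assumption 2 can be enforced with a simple first-wins overlap check. A minimal Python sketch (the function name and the `(start, end)` span representation are my own, not from the original code):

```python
def resolve_overlaps(candidates):
    """Keep the first-recognized word when two candidates share a
    character position (assumption 2).  Each candidate is a
    (start, end) span over character positions, end exclusive,
    listed in recognition order."""
    used = set()          # character positions already claimed
    kept = []
    for start, end in candidates:
        span = set(range(start, end))
        if span & used:   # overlaps an earlier word: drop it
            continue
        used |= span
        kept.append((start, end))
    return kept
```

For example, spans (8, 10) and (9, 11) share position 9, so only the first is kept.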
Test article:
A TV series has recently become very popular, so I chose an article introducing it.
Address: http://www.360doc.com/showWeb/0/0/18183.aspx
The recognition results are as follows:
PDH: Initialize Phrase Dictionary
Queryspliter reinitialize dictionary.
Changjin, Workplace, Min zhenghao, Korean Drama, Zheng yunbai, Lian Sheng, Master book, Cool Temple, Yundun, Medical girl, Zhang de, Remaining, Jeju, Select, Secretarial
Algorithm principle:
After segmentation, find a token that is a single character, then check whether the next token is also a single character. If so, count how many times that adjacent pair occurs in the text; if the count exceeds a preset threshold, accept the pair as a new word.
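The counting step above can be sketched as follows. This is a minimal Python illustration under my own assumptions (a plain token list as input, a pair-frequency threshold), not the original implementation:

```python
from collections import Counter

def find_new_words(tokens, threshold=2):
    """Count adjacent pairs of single-character tokens in a
    segmenter's output; pairs seen at least `threshold` times are
    reported as new-word candidates."""
    pairs = Counter()
    for i in range(len(tokens) - 1):
        if len(tokens[i]) == 1 and len(tokens[i + 1]) == 1:
            pairs[(tokens[i], tokens[i + 1])] += 1
    return [a + b for (a, b), n in pairs.items() if n >= threshold]
```

Longer unregistered words (three or more characters, like the three- and four-position runs in the log) would fall out of repeatedly merging confirmed pairs back into the token stream.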
The following shows the algorithm's computation process (each `>>>` line lists the position pairs of the adjacent single characters that were merged):
PDH: Initialize Phrase Dictionary
Queryspliter reinitialize dictionary.
>>> 8, 9; 9, 10
Changjin
>>> 237,238; 238,239
Workplace
>>> 595,596; 596,597; 597,598
Min zhenghao
>>> 189,190; 190,191
Korean Drama
>>> 1111,1112; 1112,1113; 1113,1114
Zheng yunbai
>>> 599,600; 600,601
Lian Sheng
>>> 610,611; 611,612
Master book
>>> 975,976; 976,977; 977,978; 978,979
Cool Temple
>>> 1233,1234; 1234,1235
Yundun
>>> 559,560; 560,561
Medical girl
>>> 561,562; 562,563
Zhang de
>>> 3114,3115; 3115,3116
Remaining
>>> 534,535; 535,536
Jeju
>>> 580,581; 581,582
Select
>>> 2071,2072; 2072,2073
Secretarial
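The `>>>` lines above pair up the text positions of the adjacent single characters behind each recognized word. For illustration, a formatter producing that layout might look like this (a guess at the format, not the original logging code):

```python
def format_positions(positions):
    """Render position pairs in the log format used above,
    e.g. [(8, 9), (9, 10)] -> '>>> 8,9; 9,10'."""
    return ">>> " + "; ".join(f"{i},{j}" for i, j in positions)
```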
This algorithm is built on top of Xiao Dingdong's word segmentation.
You are welcome to discuss and improve this algorithm.
Related links:
Text segmentation studio (contribution)
When processing a large amount of data, Xiao Dingdong's segmenter encountered a memory leak.
Lucene user salon
Original post address