1 Machine Intelligence
Natural language processing has more than 60 years of history, which can be broadly divided into two stages. The first twenty-odd years, from the 1950s to the 1970s, were a detour for scientists. Their limitation: trying to make computers understand language by simulating the human brain. It was not until the 1970s that a better approach, based on mathematical models and statistics, was found.
Approaches based on grammar analyzers could not handle complex sentences. First, to cover even 20% of real sentences with grammatical rules, the number of rules must reach at least tens of thousands. Second, even if one could write a set of grammar rules covering all natural language phenomena, it would still be computationally difficult for a computer to parse sentences with them.
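To make the rule-explosion problem concrete, here is a minimal sketch of a rule-based grammar analyzer. The grammar, lexicon, and example sentences are invented for illustration; a real analyzer of that era needed tens of thousands of such rules and still covered only a fraction of real sentences.

```python
# Toy context-free grammar: a handful of hand-written rules.
GRAMMAR = {
    "S":  [["NP", "VP"]],
    "NP": [["Det", "N"], ["Name"]],
    "VP": [["V", "NP"]],
}

# Toy lexicon mapping part-of-speech tags to words.
LEXICON = {
    "Det":  {"the", "a"},
    "N":    {"boy", "ball"},
    "Name": {"mary"},
    "V":    {"saw", "kicked"},
}

def parse(symbol, words, pos):
    """Try to expand `symbol` starting at words[pos].
    Yields every position where the expansion can end."""
    if symbol in LEXICON:                     # terminal: match one word
        if pos < len(words) and words[pos] in LEXICON[symbol]:
            yield pos + 1
        return
    for rule in GRAMMAR.get(symbol, []):      # non-terminal: try each rule
        ends = {pos}
        for part in rule:
            ends = {e2 for e in ends for e2 in parse(part, words, e)}
        yield from ends

def accepts(sentence):
    words = sentence.lower().split()
    return len(words) in parse("S", words, 0)

print(accepts("the boy saw a ball"))        # True: covered by the toy rules
print(accepts("Mary kicked the ball"))      # True
print(accepts("time flies like an arrow"))  # False: outside the rule set
```

Every sentence pattern the analyzer should accept demands more rules, and the rules interact, which is why hand-written grammars never scaled to real text.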
2 From Rules to Statistics
The handling of semantics ran into even greater trouble. The ambiguity of words in natural language is hard to describe with rules; resolving it depends on context and even on world knowledge. A classic example is Bar-Hillel's sentence "The box was in the pen," where deciding that "pen" means an enclosure rather than a writing instrument requires knowing that a box does not fit inside a writing instrument.
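The following toy sketch shows how context can resolve such ambiguity statistically, the idea behind the approach described next. The co-occurrence counts and sense labels are invented for illustration; in practice they would be estimated from a large corpus.

```python
from collections import Counter

# Hypothetical counts of how often each sense of "pen"
# co-occurs with various context words (invented numbers).
COOCCURRENCE = {
    "pen/writing-instrument": Counter({"ink": 40, "paper": 35, "write": 50, "box": 2}),
    "pen/enclosure":          Counter({"sheep": 30, "farm": 25, "box": 20, "fence": 15}),
}

def disambiguate(senses, context):
    """Pick the sense whose co-occurrence counts best match the context words.
    A missing word simply contributes a count of zero."""
    return max(senses, key=lambda s: sum(COOCCURRENCE[s][w] for w in context))

senses = ["pen/writing-instrument", "pen/enclosure"]
# "the box was in the pen": the context favors the enclosure sense
print(disambiguate(senses, ["box"]))
# "the pen is full of ink": the context favors the writing instrument
print(disambiguate(senses, ["ink", "write"]))
```

No hand-written rule is consulted; the decision falls out of counting which sense is more plausible given the surrounding words.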
The change came from Frederick Jelinek and his group at the IBM Watson Laboratory.
The controversy between the rule-based and statistical approaches to natural language processing lasted about 15 years, for two reasons. First, it takes many years for a new research method to mature. Second, for statistical methods to displace the traditional ones, the field had to wait for the older generation of linguists to retire.