1 Machine Intelligence
Natural language processing has more than 60 years of development behind it, which can be divided into two stages. The first twenty-odd years, from the 1950s to the 1970s, were a detour for scientists. The limitation of that period: trying to achieve language understanding by simulating the human brain with a computer. It was not until the 1970s that methods based on mathematical models and statistics were found.
A grammar parser cannot handle complex sentences. First, to cover even 20% of real sentences, the number of grammatical rules would have to reach at least the tens of thousands. Second, even if one could write a set of grammatical rules covering all natural-language phenomena, the computational cost of parsing with them grows too fast for a computer to handle.
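To make the second point concrete, here is a minimal sketch (my own illustration, not from the original text): even under the idealized assumption that any two adjacent constituents may combine, the number of possible binary parse trees for a sentence of n words is the (n-1)-th Catalan number, so the search space a rule-based parser must consider explodes with sentence length.

    from math import comb

    def parse_tree_count(num_words: int) -> int:
        # Number of distinct binary parse trees over num_words leaves:
        # the (num_words - 1)-th Catalan number.
        n = num_words - 1
        return comb(2 * n, n) // (n + 1)

    for words in (5, 10, 20, 40):
        print(f"{words:2d} words -> {parse_tree_count(words):,} candidate trees")

For a 40-word sentence this is already on the order of 10^20 candidate trees, which is one way to see why exhaustive rule-based parsing of long sentences was hopeless.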
2 From Rules to Statistics
The processing of semantics ran into even greater trouble. The ambiguity of words in natural language is hard to describe with rules; resolving it depends on context, and sometimes even on world knowledge. In Bar-Hillel's classic example, deciding that "pen" means an enclosure rather than a writing instrument in "the box was in the pen" requires knowledge about the world, not grammar.
The turning point: Frederick Jelinek and the IBM Watson Laboratory he led
The controversy between rule-based and statistical approaches to natural language lasted about 15 years, for two reasons. First, it took many years for the new, statistics-based approach to mature and prove itself. Second, replacing traditional methods with statistical ones had to wait for the older generation of linguists to retire from the field.