"What to think" and "What to eat" is an artificial intelligence contest

Source: Internet
Author: User

"What to think" and "what to eat" the two factions have been in the field of artificial intelligence two classes, two lines of struggle, this struggle is sometimes a life-and-death. Set a vulgar philosophical word, the former partial idealism, the latter biased materialism.

The hottest term in the computer field is "deep learning."

The linguist Steven Pinker challenges neural network theory

The Turing Test opens the battle between two routes in the field of artificial intelligence

Michael Arbib, a heavyweight of artificial intelligence, founded the computer science department at the University of Massachusetts

Ever since Turing raised the question of "machinery and intelligence," there have been two views. One faction believes artificial intelligence must be realized with logic and symbolic systems; this is the top-down view of the problem. The other believes artificial intelligence can be achieved by mimicking the brain; this is the bottom-up view, and its adherents assume that if a machine can be built to simulate the neural networks of the brain, that machine will be intelligent. The former faction I will call the "what to think" faction (you are what you think), the latter the "what to eat" faction (you are what you eat); their ideas might be traced back to primitive Chinese folk beliefs. To put it in crude philosophical terms, the former leans toward idealism, the latter toward materialism. The two factions have been fighting a battle of two classes and two lines in artificial intelligence ever since.

The first paper on neural networks was published in 1943, and both of its authors were legends: McCulloch and Pitts. Let us take them one at a time. Walter Pitts loved mathematics and philosophy; in junior high school he read Russell's "Principia Mathematica" and began corresponding with Russell, who admired him and invited him to Britain to study under him. But Pitts came from a poor family; he could not even afford high school, so studying in Britain was naturally out of the question. When he was 15, his father forced him to drop out and work, and like many poor children who love books, Pitts ran away from home. Hearing that his idol Russell was lecturing at the University of Chicago, he went to Chicago alone, actually met Russell, and was referred by him to Carnap, who was also teaching in Chicago. Carnap, wanting to see just how clever the boy was, handed him his book "The Logical Syntax of Language"; in less than a month, Pitts returned it with the margins filled with notes. Astonished, Carnap arranged a janitorial job for him at the University of Chicago. Do not look down on janitors: in the film "Good Will Hunting," Matt Damon plays a janitor at a famous university who casually solves a mathematics problem and catches the professors' attention. Sweeping floors at least kept Pitts off the street. In Chicago, Pitts later met McCulloch. Warren McCulloch had studied philosophy and psychology at Yale, then earned a master's degree in psychology and an MD at Columbia. An MD is not the same as a PhD: an MD is not an academic degree but a terminal professional degree, like an MBA or MFA; the D in MD stands for Doctor of Medicine, while a PhD is a Doctor of Philosophy. After several years as an intern, McCulloch went to Yale to study neurophysiology, then became a professor of psychiatry at the University of Illinois at Chicago. McCulloch's strength was neuroscience, but he did not know much mathematics, so he and the 17-year-old homeless mathematics prodigy Pitts were a perfect match. The fruit of their collaboration was the first paper on neural networks, "A Logical Calculus of the Ideas Immanent in Nervous Activity," published in the Bulletin of Mathematical Biophysics. The paper also became one of the sources of cybernetics.
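
For a taste of what McCulloch and Pitts actually proposed, here is a minimal sketch of a threshold neuron of their kind: a unit that fires when the weighted sum of its binary inputs reaches a threshold, which already suffices to implement logic gates. (This is an illustration in modern Python, of course, not the notation of the 1943 paper.)

```python
# A McCulloch-Pitts style unit: binary inputs, fixed weights, a threshold.
# Illustrative sketch only; the 1943 paper uses its own logical notation.

def mp_neuron(inputs, weights, threshold):
    """Fire (return 1) iff the weighted sum of inputs reaches the threshold."""
    total = sum(x * w for x, w in zip(inputs, weights))
    return 1 if total >= threshold else 0

# Basic logic gates fall out as special cases:
AND = lambda a, b: mp_neuron([a, b], [1, 1], threshold=2)
OR  = lambda a, b: mp_neuron([a, b], [1, 1], threshold=1)
NOT = lambda a:    mp_neuron([a], [-1], threshold=0)

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "AND:", AND(a, b), "OR:", OR(a, b))
```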

Norbert Wiener, the founder of cybernetics, was a child prodigy. His father, a Harvard professor, took him to England to meet Russell, but Russell disliked both the child and the father. After the turn of the twentieth century, few people could measure up in Russell's eyes anyway; as one might quip, a scientist who does not want the Nobel Prize in Literature is not a good lover. Wiener later taught at Harvard but was not liked by the mainstream mathematicians there and did not get tenure. He finally landed next door at MIT, and during World War II did research on weaponry. At the time, the best mathematicians and physicists had joined the Manhattan Project to build the atomic bomb; Wiener had not. That may have had something to do with his personality: colleagues and family felt he was oblivious to anything outside mathematics. After Wiener coined the term "cybernetics," he became a big name at MIT and attracted plenty of funding, so McCulloch could bring Pitts and a band of others over to join him; whoever holds the money is the big brother, the same everywhere. Wiener's wife Margaret was a Nazi sympathizer; during World War II the family even kept an English edition of Hitler's "Mein Kampf" at home. Their daughter Barbara was in primary school at the time and, half-consciously, read the book too; a school essay she wrote quoting its "aphorisms" nearly got her expelled. McCulloch's wife was Jewish, something Margaret had long resented. In fact, Wiener's own ancestors were Polish Jews; what was Margaret thinking? Did Wiener marry Margaret to mock himself? It is like the many Chinese men who take foreign wives, or foreigners who marry Chinese: the point is not looks but rarity. In any case, Wiener ended up neutralizing himself as an "agnostic." Margaret once told Wiener that McCulloch's group (possibly alluding to Pitts) had seduced her precious daughter Barbara; the furious Wiener immediately severed all dealings with McCulloch and his students. It now looks as though Margaret was deliberately spreading disinformation. But Wiener's break caused Pitts great trauma. Pitts had been Wiener's special student at MIT, but having tasted success so young, he could not withstand the setback, and his temperament grew strange. After falling out with Wiener, he refused the graduate degree MIT offered him and lost heart for scholarship. Pitts died in 1969, a few months before his elder McCulloch, at only 46.

Not many people received Wiener's true mantle, but Michael Arbib was one of them. He earned his PhD at 23 and became famous with the popular science book "Brains, Machines, and Mathematics." Arbib later founded the computer science department at the University of Massachusetts and brought along a band of AI people, including Andy Barto, who later made his name in "reinforcement learning" and kept the university's AI research at the front of that field. Arbib afterwards moved to the University of Southern California, where he holds a pile of professorships: computer science, biology, biomedical engineering, electrical engineering, neuroscience, and psychology. If he printed them all, his business card would resemble that of a Chinese peasant entrepreneur, with "CPPCC member" or "NPC deputy" on the back. After arriving in Southern California, Arbib never again produced results as influential as his earlier ones. During the neural network depression, Barto's adaptive learning laboratory sheltered a good many people for a time, including later big shots such as Michael Jordan; Jordan, at Berkeley, went on to raise Andrew Ng and a whole crop of talent, but that is a story for later.

In 1949, the neuropsychologist Hebb published "The Organization of Behavior," in which he proposed a learning mechanism known to posterity as the "Hebb rule." The rule holds that if two cells are always activated at the same time, there is some kind of association between them, and the higher the probability of co-activation, the stronger the association. In other words, "you are what you eat." The Hebb rule was later confirmed in animal experiments by Eric Kandel, the 2000 Nobel laureate. The later varieties of unsupervised machine learning algorithms are all more or less variants of the Hebb rule.
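
The rule is easy to state in code: strengthen a connection whenever the two units it joins are active together. A minimal sketch follows; this is my illustration of the idea, not Hebb's own formulation.

```python
import numpy as np

# Hebbian update: strengthen w[i, j] whenever units i and j fire together
# ("cells that fire together wire together"). Illustrative sketch only.

def hebb_update(W, x, lr=0.1):
    """W: weight matrix; x: activity vector for one pattern."""
    return W + lr * np.outer(x, x)

n = 4
W = np.zeros((n, n))
patterns = [np.array([1, 1, 0, 0]),
            np.array([1, 1, 0, 0]),
            np.array([0, 0, 1, 1])]
for x in patterns:
    W = hebb_update(W, x)

# Units 0 and 1 were co-active twice, so their link is now the strongest.
print(W)
```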

The next major breakthrough in neural network research came in 1957. Frank Rosenblatt, an experimental psychologist at Cornell University, simulated a neural network model of his own invention, the "perceptron," on an IBM 704 computer. The model could carry out some simple visual processing tasks, and it caused a sensation. Rosenblatt proved theoretically that a single-layer neural network converges when handling linearly separable pattern recognition problems, and ran experiments with perceptrons capable of learning. In 1962 Rosenblatt gathered all of his research into the book "Principles of Neurodynamics: Perceptrons and the Theory of Brain Mechanisms," which became the bible of the "what to eat" faction. Rosenblatt's reputation grew, and more research money followed; both the Department of Defense and the Navy funded his work. The media also showed undue interest in Rosenblatt; after all, a machine that simulates the brain is, of course, headline material. Rosenblatt shed his old shyness and appeared in the media constantly, driving a sports car, playing the piano, showing off everywhere. This made quite a few colleagues unhappy.
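
Rosenblatt's convergence result can be watched in miniature: on a linearly separable problem such as logical AND, the perceptron learning rule settles on a separating line within a few epochs. A toy sketch (illustrative only; Rosenblatt's actual experiments ran on the IBM 704):

```python
# Perceptron learning rule on a linearly separable problem (logical AND).
# Rosenblatt's convergence theorem guarantees this loop terminates.

def train_perceptron(samples, epochs=20, lr=1.0):
    w0, w1, b = 0.0, 0.0, 0.0
    for _ in range(epochs):
        errors = 0
        for (x0, x1), target in samples:
            pred = 1 if w0 * x0 + w1 * x1 + b > 0 else 0
            if pred != target:
                w0 += lr * (target - pred) * x0
                w1 += lr * (target - pred) * x1
                b  += lr * (target - pred)
                errors += 1
        if errors == 0:      # converged: every point classified correctly
            break
    return w0, w1, b

AND = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
print(train_perceptron(AND))   # a weight vector that separates AND
```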

Minsky was one of the founders of artificial intelligence and an organizer of the Dartmouth Conference. He quarreled with Rosenblatt at a meeting, arguing that neural networks could not solve the problem of artificial intelligence. Subsequently, Minsky collaborated with another MIT professor, Papert, in an attempt to prove the point theoretically. The fruit of their cooperation was a book of enormous and mixed influence: "Perceptrons: An Introduction to Computational Geometry." In it, Minsky and Papert proved that a single-layer neural network cannot solve the XOR (exclusive or) problem. XOR is a basic logical operation; if even it cannot be handled, the computational power of neural networks is limited indeed. In fact, Rosenblatt had himself guessed that the perceptron might have limitations, especially in the area of "symbolic processing"; he knew from his neuropsychological experience that some brain-damaged patients cannot handle symbols. But the perceptron's flaws, presented in Minsky's hostile manner, dealt Rosenblatt a fatal blow. The government agencies that had been funding neural network research gradually withdrew. Rosenblatt drowned in a sailing accident on his 43rd birthday in 1971; many believe he killed himself. When Wang Guowei drowned himself in the lake, his note read, "Having lived through such a change in the world, honor permits no further humiliation." For Rosenblatt, I suppose the "humiliation" was Minsky's book, and the "change in the world" was the depression of the neural network discipline that followed. The difference is that Wang's "change in the world" was the tide of history, while the fortunes of neural networks would reverse within a decade.
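
The heart of the XOR argument fits in four inequalities. A single-layer perceptron computes a thresholded weighted sum, and realizing XOR would require weights satisfying all of the following, which is impossible:

```latex
% Constraints a single-layer perceptron f(x_1,x_2) = [w_1 x_1 + w_2 x_2 + b > 0]
% would need in order to realize XOR:
\begin{align*}
f(0,0) = 0 &\;\Rightarrow\; b \le 0 \\
f(1,1) = 0 &\;\Rightarrow\; w_1 + w_2 + b \le 0 \\
f(0,1) = 1 &\;\Rightarrow\; w_2 + b > 0 \\
f(1,0) = 1 &\;\Rightarrow\; w_1 + b > 0
\end{align*}
% Adding the last two lines gives w_1 + w_2 + 2b > 0, hence
% w_1 + w_2 + b > -b \ge 0, contradicting the second line.
% No single-layer perceptron computes XOR.
```

This back-of-the-envelope version is mine; Minsky and Papert's book treats the question far more generally, in geometric terms.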

On the surface this was science, but there is evidence that Minsky and Rosenblatt had history. They attended the same school: the Bronx High School of Science, probably the best high school in the world, counting eight Nobel laureates and six Pulitzer Prize winners among its graduates. To go no further back: Minsky graduated in 1944, Chomsky in 1945, and Rosenblatt in 1946. American high school takes four years, so Minsky and Rosenblatt overlapped for at least two; they knew each other, and they envied each other. The 1956 Dartmouth Conference defined the term "artificial intelligence"; the organizers included Minsky, McCarthy, and Shannon, and the participants included Herbert Simon, Allen Newell, and others. The conference mentioned neural networks only in passing while defining the field. Minsky had once been a supporter of neural networks: his 1954 Princeton doctoral dissertation, "Theory of Neural-Analog Reinforcement Systems and Its Application to the Brain-Model Problem," was in fact a treatise on neural networks. In an interview in his later years, he joked that the 300-page dissertation had never been formally published, that probably only three copies existed, and that he could not remember its contents. He seemed to be trying to exonerate himself from his tangled relationship with the neural network discipline. The theme of the Dartmouth Conference was not neural networks but what Newell and Simon later dubbed the "physical symbol system"; in other words, at Dartmouth, "what to think" set the tone.

Rosenblatt was a year younger than Minsky, so the jealousy was natural. At work, Minsky's Artificial Intelligence Laboratory at MIT was applying for funding from the same Department of Defense and Navy. And most scientists in the circle were disgusted by Rosenblatt's sudden stardom. Minsky had started out in the "what to eat" faction, but by this time he had switched to the "what to think" faction. Because of his and Papert's critique of the perceptron, the "what to eat" faction later branded the two of them "the devil's pair." In fact, Minsky had met Papert through an introduction by McCulloch; history really is tangled. They were called devils because the first edition of "Perceptrons" declared that most of Rosenblatt's writing was without scientific value. That was going rather far, but Rosenblatt was not well liked, and no peers stood up for him.

Widrow, a Stanford professor one year younger than Rosenblatt, proposed the Adaline adaptive algorithm at about the time Rosenblatt first presented the perceptron. Adaline is similar to the perceptron and is one of the originator models of machine learning. While Rosenblatt enjoyed his fame, Widrow basked in the reflected glory, yet after Rosenblatt died, Widrow drew none of the criticism. Widrow has stood unshaken for decades because he later worked mainly on integrated circuits in the electrical engineering (EE) department rather than pursuing factionally tangled artificial intelligence research in a computer science department: different circles, different ways.

The perceptron's fall from grace sent neural network research into decline; Carver Mead, Caltech's integrated circuit guru, called it "a twenty-year famine." In the second edition of "Perceptrons," Minsky deleted the sentence that personally attacked Rosenblatt and added the handwritten dedication "In memory of Frank Rosenblatt." But the other scientists who had suffered through the Great Famine thought Minsky unforgivable, and once neural networks regained momentum, they denounced him without pause. The Institute of Electrical and Electronics Engineers (IEEE) established the Rosenblatt Award in 2004 to reward outstanding research in the field of neural networks.

The setback at the intersection of information science and neuroscience did not affect neurobiology itself. The Harvard biologists Hubel and Wiesel studied how nerve cells in the retina and visual cortex process information, work for which they won the 1981 Nobel Prize in Medicine. Subsequently, David Marr of MIT, who died before his time, created mathematical models of visual information processing that influenced the later connectionist movement. Wiesel eventually left Harvard for Rockefeller University. After the Nobel laureate David Baltimore was forced by an academic scandal to resign the presidency of Rockefeller in 1991, Wiesel became the university's president, contributing to its establishment as a powerhouse of biology.

A 1974 Harvard doctoral dissertation proved that adding a layer to the neural network and training it with the back-propagation learning algorithm could solve the XOR problem. The author of that dissertation was Werbos, who was later given the IEEE Neural Networks Society's Pioneer Award. Werbos's work drew little attention when it appeared: it was the trough of neural network research, and the article was simply ill-timed.
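
What the extra layer buys can be seen directly: a small two-layer network trained with back-propagation learns XOR. Here is a minimal sketch in modern conventions (sigmoid units, squared error, full-batch gradient descent), not Werbos's original formulation:

```python
import numpy as np

# A two-layer network trained by back-propagation learns XOR, the problem
# a single-layer perceptron provably cannot solve. Illustrative sketch only.

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(0, 1, (2, 4)); b1 = np.zeros(4)   # hidden layer, 4 units
W2 = rng.normal(0, 1, (4, 1)); b2 = np.zeros(1)   # output layer

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0
for step in range(20000):
    h = sigmoid(X @ W1 + b1)                 # forward pass
    out = sigmoid(h @ W2 + b2)
    d_out = (out - y) * out * (1 - out)      # backward pass (squared error)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ d_out;  b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h;    b1 -= lr * d_h.sum(axis=0)

print(out.round(3))   # typically converges toward [[0], [1], [1], [0]]
```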

The revival of neural networks in the 1980s is credited to the physicist Hopfield. In 1982, Hopfield, then a professor of biophysics at Caltech, proposed a novel neural network that could solve a large class of pattern recognition problems and give approximate solutions to a class of combinatorial optimization problems. The model came to be called the Hopfield network. In 1984, Hopfield implemented his model in analog integrated circuits. Hopfield also raised a number of up-and-comers, including Terry Sejnowski, now director of the Computational Neurobiology Laboratory at the Salk Institute for Biological Studies. Hopfield later moved to Princeton as a professor of molecular biology and is now retired. The appearance of the Hopfield model gave the neural network field a great boost. The survivors of the early neural network era, encouraged by the biologist Crick (the Nobel laureate who co-discovered the double helix of DNA) and the cognitive scientist Don Norman, launched the "connectionism" movement, based at the University of California, San Diego. The leaders of the movement were two psychologists, Rumelhart and McClelland, plus a computer scientist, Geoffrey Hinton.
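
The model itself is compact enough to sketch: store patterns in a weight matrix with a Hebbian outer-product rule, then recover a stored pattern from a corrupted cue by repeated thresholding. A toy illustration of the 1982 binary model (my simplification, with synchronous updates):

```python
import numpy as np

# A tiny Hopfield network: +/-1 units, Hebbian outer-product storage,
# zero self-connections, recall by iterating sign(W @ x).

patterns = np.array([
    [ 1,  1,  1, -1, -1, -1],
    [-1, -1,  1,  1,  1, -1],
])

W = sum(np.outer(p, p) for p in patterns).astype(float)
np.fill_diagonal(W, 0)                      # no self-connections

def recall(x, steps=10):
    x = x.copy()
    for _ in range(steps):                  # synchronous updates for simplicity
        x = np.where(W @ x >= 0, 1, -1)
    return x

noisy = np.array([1, -1, 1, -1, -1, -1])    # first pattern, one bit flipped
print(recall(noisy))                        # recovers [ 1  1  1 -1 -1 -1]
```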

One of the results of the connectionist movement was the famous anthology known as the PDP book ("Parallel Distributed Processing"). Its publication whipped up a whirlwind in cognitive science and computer science, and it became the bible of a new generation of neural network rookies. "Neural networks" in the '80s were like the Internet in the '90s, Web 2.0 after that, and "Big Data" now: everyone wanted a piece of the action. Even some theory heavyweights could not resist; Rivest, the R of the RSA algorithm, took on several students to work on the complexity of neural network learning problems. One red flag still flying, colored banners fluttering on all sides: a lively scene indeed. In 1990, the IEEE began publishing its Transactions on Neural Networks, providing an outlet for high-quality articles in the field. The U.S. Department of Defense, the Navy, and the Department of Energy also increased their funding. Neural networks suddenly became a fashionable discipline.

The connectionist movement also fostered a pile of newcomers, and it made UC San Diego's cognitive science department the leader among its peers. Rumelhart, who later moved to Stanford University, struggled for years with a neurodegenerative disease and died last year. Jordan was his student, and Andrew Ng was Jordan's student: Rumelhart is gone, but the incense has not gone out. Another of his students, Robert Glushko, later left academia for business, following the early Silicon Valley internet hero Baum, and created an XML company that was afterwards sold to Commerce One for a tidy sum. Glushko donated money to establish the Rumelhart Prize for researchers in the field, and Hinton became its first winner. McClelland first went to Carnegie Mellon as a joint professor of computer science and psychology, then moved to Stanford, where he established the Center for Mind, Brain, and Computation and for a time chaired the psychology department.

Hinton went first to Carnegie Mellon and ended up in the computer science department at the University of Toronto in Canada. Hinton is now the most formidable figure in neural networks. He also has a revolutionary family history little known to outsiders: he is the great-great-grandson of Boole (yes, the Boole of Boolean algebra), and his great-grandmother Ellen was Boole's daughter. William Hinton (known in China as Han Ding) and Joan Hinton (Han Chun, literally "Cold Spring"), participants in the Chinese revolution and hardcore American leftists, were also Ellen's grandchildren; by that reckoning, William was Geoffrey Hinton's uncle of a sort (a cousin of his father's generation) and Joan his aunt. Boole's youngest daughter, Ellen's sister Ethel Lilian Voynich, was the author of the novel "The Gadfly," which spread through the Soviet Union and China. Dim in the West but bright in the East, "The Gadfly" was for generations of readers in the Soviet Union and China a bestseller of revolution plus love plus inspiration. When Voynich fell into hardship in New York in her later years, royalties from the Soviet Union, and an unexpected payment from China approved by Zhou Enlai, allowed her a decent end. This one family links China, the Soviet Union, revolution, logic, and neural networks, sweeping up both the "what to eat" faction and the "what to think" faction.

Steven Pinker, linguist and public intellectual, disagrees with connectionism. Rumelhart and McClelland had a chapter in the PDP bible claiming that a neural network can learn the past tense of verbs: shown "start," it knows "started"; shown "come," it knows "came"; and so on. Pinker argued that regular past tenses (formed by directly adding -ed, as in "started") are produced by simple rule-based computation, while irregular ones (not formed by adding -ed, as in "came") are stored in a specific region of the brain. Pinker cited neuropsychological evidence that regular and irregular operations take place in different parts of the brain, and that the neural network's behavior resembled that of brain-damaged aphasic patients. In truth this observation is not deep; it is all leftovers from what Rosenblatt was playing with thirty years earlier. That symbolic systems may suit rules and neural networks may suit the irregular cases is something any ordinary person could think of. The criticism of neural networks runs the same way: we can define a rule and implement it with either a symbolic system or a neural network; use whichever is quicker.
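
The symbolic account is almost trivially easy to implement: a lookup table for the irregulars, with a default "-ed" rule for everything else. A toy sketch of the "words and rules" idea, not Pinker's actual model:

```python
# "Words and rules" in miniature: irregular past tenses live in a lexicon
# (rote memory); everything else falls through to the -ed rule.
# Toy illustration only.

IRREGULARS = {"come": "came", "go": "went", "eat": "ate", "sing": "sang"}

def past_tense(verb: str) -> str:
    if verb in IRREGULARS:        # memory: stored exceptions
        return IRREGULARS[verb]
    if verb.endswith("e"):        # rule: add -d / -ed
        return verb + "d"
    return verb + "ed"

for v in ("start", "come", "bake", "sing"):
    print(v, "->", past_tense(v))
```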

The debate over the methodologies of symbolic processing and neural networks is sometimes exaggerated. The great Chomsky does not recognize the latest developments in artificial intelligence. Machine translation has always been one of the touchstones of AI, like computer chess before 1996. Early machine translation practice grew out of Chomsky's theories, but the recent breakthroughs are based on statistical methods. Chomsky argues that the statistical approach is not "elegant": it is imitation rather than understanding. Being able to ride a bicycle does not count as understanding; only being able to explain why the bicycle does not fall over counts. Peter Norvig, Google's director of research, defended the statistical approach: simple models (such as Chomsky's theory and its later improved versions) cannot solve complex problems, and the further development of artificial intelligence must walk on both legs. Norvig, a computer science professor at the University of California, Berkeley before joining Google, knows both camps well and is respected in academia and industry alike; the "Artificial Intelligence" textbook he co-authored is the most popular in the field. His views seem to be the ones more people accept.

The glory of neural networks in the '80s was eclipsed by the Internet that came after. But in the past few years, it is precisely the Internet that has given neural networks their bigger chance. The hottest term in computer science these years is "deep learning." A neural network is composed of layers of neurons: the more layers, the deeper. So-called deep learning uses many layers of neurons to form a neural network that carries out machine learning. Hinton is the originator of deep learning; his 2006 article opened up this new field. In the newest deep neural networks, each node of the last couple of layers can correspond to certain concepts. This is a great advance for neural networks: it seemingly finds a scientific basis for "what to eat" and reconciles the contradiction with the symbolic faction. Whether the symbolists buy it is another matter. The measured performance of deep learning is very good. Hinton first applied it to image recognition; later, Microsoft used deep learning to build practical speech recognition and simultaneous translation systems.
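
In code, "deeper" simply means more layers composed in sequence. Below is a minimal forward pass through such a stack (the layer sizes are arbitrary examples; Hinton's 2006 contribution was a layer-by-layer way of training such stacks, which this sketch does not attempt):

```python
import numpy as np

# "Deep" = many layers composed: each layer is an affine map plus a
# nonlinearity. Forward pass only; training is the hard part.

rng = np.random.default_rng(1)
layer_sizes = [784, 500, 500, 2000, 10]     # e.g. an image-sized input stack

weights = [rng.normal(0, 0.01, (m, n))
           for m, n in zip(layer_sizes, layer_sizes[1:])]
biases = [np.zeros(n) for n in layer_sizes[1:]]

def forward(x):
    for W, b in zip(weights, biases):
        x = np.maximum(0, x @ W + b)        # ReLU nonlinearity between layers
    return x

x = rng.normal(0, 1, (1, 784))              # one fake input "image"
print(forward(x).shape)                     # (1, 10)
```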

Hinton, past sixty and unwilling to sit idle, started a company with two of his students focused on deep learning. Not long after it was founded, Google and Microsoft both had acquisition ideas, and then Baidu joined the bidding; in the end the flower fell to Google, which in early 2013 paid tens of millions of dollars for a company with only three employees. To bring Hinton onto its roster, Google really is not short of money.

In 2012, Andrew Ng, director of the Artificial Intelligence Laboratory at Stanford University, collaborated with Google to build the largest neural network of its time, a project housed in Google's mysterious X lab. Google's cat-face recognition, which once went viral on the Internet, was this network's doing; it had up to 1.7 billion parameters. Later, Ng built an even bigger neural network at Stanford, with a whopping 11.2 billion parameters. The human brain has on the order of 100 trillion neural connections; measured by computational scale, for such an artificial neural network to approach the brain, each artificial neuron would have to do the work of some 10,000 biological ones. These neural networks run on large numbers of graphics processing units (GPUs). The GPU is near-perfect hardware for simulating neural networks, because each GPU chip carries a large number of small cores, which naturally matches the massive parallelism of neural networks. Advances in hardware have made possible what once was not.
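
The fit with GPUs is easy to see: the heart of a neural network layer is one large matrix multiplication, and every entry of the result can be computed independently, exactly the kind of work thousands of small GPU cores do in parallel. A CPU sketch of the core operation (GPU frameworks execute the same expression across their cores):

```python
import numpy as np

# One network layer over a whole batch is a single matrix multiply:
# batch * n_in * n_out independent multiply-adds, which is why GPUs,
# with their thousands of small cores, map onto it so well.
# (numpy/CPU sketch; the expression is the same on a GPU.)

batch, n_in, n_out = 256, 4096, 4096
X = np.random.rand(batch, n_in).astype(np.float32)   # a batch of inputs
W = np.random.rand(n_in, n_out).astype(np.float32)   # one layer's weights

H = np.maximum(0, X @ W)    # the layer's entire forward computation
print(H.shape)              # (256, 4096)
```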

McCarthy, the founder of the Stanford Artificial Intelligence Laboratory, was the chief organizer of the Dartmouth Conference; the term "artificial intelligence" was his coinage, and it was he who brought Minsky to MIT, where he himself first taught. To call him the father of artificial intelligence is no exaggeration, and Uncle John was a hardcore symbolist. Yet the current director of that same AI lab is Andrew Ng, a neural network man. The change may be a weather vane for the rise of the "what to eat" faction. The goal of Ng's neural network at Stanford is to simulate the human brain. It makes one think of Rosenblatt: was that not his dream?

This article was inspired by an old friend, to whom I give my thanks. Every chat with him pays dividends.
