Have you ever imagined this scenario: one day you turn on the TV to watch the weather forecast and find that it can accurately predict the weather at your doorstep at 8 o'clock tomorrow morning.
Or one day, when you want to buy a house, you no longer have to trudge from viewing to viewing. You simply open a computer, type a few keywords into a digital map, and see the scenery around the house and its internal layout, and even experience for yourself the view from its balcony.
Or when you walk into a cinema to see the Hollywood sci-fi blockbuster "Inception", you can imagine that, as in the film, a realistic space can be simulated inside a dream with such fidelity that it deceives the people pulled into the dream, making them believe they are in reality...
You might say such things exist only in the world of "Inception". Indeed, in real life, simulating a real space, especially to a convincing degree of fidelity, demands an amount of computation and rendering that would overwhelm an ordinary computer, let alone the human brain.
But with the arrival of the big data age and supercomputers equipped with the "strongest brains", it is no longer just a dream.
1. Massive Computing
If an ordinary computer runs at the pace of a walking adult, a supercomputer moves at the speed of a rocket. At such extreme computing speeds, numerical simulation makes it possible to predict and explain natural phenomena that cannot be studied experimentally.
For many people, computers have become an inseparable part of life. Do you think the quad-core PC in your home is already powerful? At the National Supercomputer Center in Guangzhou, hosted by Sun Yat-sen University, there is a supercomputer with 3.12 million cores and a floating-point speed of 33.86 quadrillion operations per second (33.86 petaflops): Tianhe-2, developed by the National University of Defense Technology.
Entering the Guangzhou supercomputing center, the reporter saw the world's fastest supercomputer, Tianhe-2. A long row of black cabinets stood neatly in a laboratory as large as three basketball courts. Although the cabinets are physically separate, the staff explained that they are interconnected and work as a single computer.
Fast calculation, huge storage and a very large physical footprint: this was the reporter's most immediate impression of the supercomputer.
To better understand this "big guy", we might as well turn the clock back to the beginning.
February 14, 1946 was an epoch-making day in human history: the world's first electronic computer, ENIAC, was born in a laboratory at the University of Pennsylvania. ENIAC was a true monster. It consisted of 17,468 vacuum tubes, 60,000 resistors, 10,000 capacitors and 6,000 switches, weighed 30 tons, covered 160 square meters, consumed 174 kilowatts of power, cost 450,000 dollars, and could perform 5,000 additions per second.
In the more than 60 years that followed, technology represented by the electronic computer spread widely through society and daily life. But as humans pushed deeper into high-tech fields, they encountered computational problems far beyond the power of ordinary computers.
At this point, the supercomputer appeared in due course. A supercomputer is usually a machine composed of hundreds of thousands of processors or more, capable of high-speed operations on large volumes of data that ordinary PCs and servers cannot handle. Take Tianhe-2: it would take 1.3 billion people, each working a calculator for 1,000 years, to match what Tianhe-2 computes in one hour.
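The scale of that comparison can be checked with simple arithmetic. The sketch below uses the 33.86-petaflop Linpack speed quoted earlier; the per-person calculator rate it derives is our own rough estimate, not a figure from the article.

```python
# Back-of-envelope check: 1.3 billion people with calculators for
# 1,000 years versus one hour of Tianhe-2 at 33.86 petaflops.
TIANHE2_OPS_PER_SECOND = 3.386e16   # 33.86 petaflops (Linpack)
SECONDS_PER_HOUR = 3600
SECONDS_PER_YEAR = 3.156e7          # ~365.25 days

ops_in_one_hour = TIANHE2_OPS_PER_SECOND * SECONDS_PER_HOUR

people = 1.3e9
years = 1000
person_seconds = people * years * SECONDS_PER_YEAR

# How fast would each person have to punch a calculator to keep up?
ops_per_person_per_second = ops_in_one_hour / person_seconds
print(round(ops_per_person_per_second, 1))  # roughly 3 operations/second
```

At roughly three calculator operations per person per second, sustained day and night for a millennium, the comparison holds up as an order-of-magnitude claim.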
Since the basic components of a supercomputer differ little in concept from those of a personal computer, many people mistakenly assume that a supercomputer is simply a stack of CPUs. But this is a misunderstanding.
Hureyun, deputy director of the State Key Laboratory of High-Efficiency Server and Storage Technology at Inspur Group, offered an analogy: connect 1,000 PCs with ordinary network cables and you might burn 10,000 kilowatt-hours of electricity over 10 days to finish a job; a real supercomputer might finish the same job in two days on only 5,000 kilowatt-hours.
All the engineering that goes into a supercomputer serves performance and efficiency. In an era of information explosion and rapid scientific progress, supercomputers are essential for pushing entire fields forward. Today a supercomputer is also a marker of a country's scientific and technological strength, underpinning high-tech fields such as weather forecasting, genetic engineering, the nuclear industry, the military, astrophysical simulation and aerospace.
2. Super Applications
As supercomputers develop, most of the areas they reach are closely tied to people's everyday lives, such as weather forecasting and climate simulation, earthquake prediction, three-dimensional maps and big data applications.
As "the brain of modern science and technology", the supercomputer is widely used in cutting-edge frontier fields such as Earth climate simulation, astronomical research, gene research, oil exploration and natural disaster forecasting.
And the coming big data era poses a new challenge to humanity's ability to master data; using supercomputers to solve the major problems of the big data era is a matter of urgency.
China's rise to the world's advanced level in oil exploration came through a period of hardship. Changhai, chief engineer at the research and development center of CNPC's Eastern Geophysical Company (BGP), put it this way: "In oil exploration, as the saying goes, it is easier to ascend to heaven than to go underground."
In the Qaidam Basin of Qinghai Province, the Yingxiong ("Hero") Ridge area sits at an altitude of 4,000 meters; its harsh terrain and complex seismic-geological conditions make seismic prospecting there extraordinarily difficult.
Changhai said: "In oil exploration we cannot see the oil. The most direct method is drilling, but drilling costs too much, so we rely on artificial seismic waves for detection, followed by data analysis. That computation places a very heavy load on both hardware and software."
So-called artificial seismic wave detection, also known as seismic prospecting, means sending seismic waves into the ground; receivers on the surface pick up the reflected waves, which are then analyzed to determine where the oil lies. Receiving and analyzing these waves requires not only the ability to capture huge volumes of data instantaneously but also many complex computations.
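The core geometric idea behind this echo analysis can be sketched in a few lines. The function name and the velocity figure below are illustrative assumptions, not real survey parameters; production seismic processing inverts vastly larger data sets with far more sophisticated models.

```python
# Minimal sketch of seismic reflection geometry: a surface source emits
# a wave, a receiver records the two-way travel time of its echo from a
# rock layer, and depth is estimated from the wave speed in the rock.

def reflector_depth(two_way_time_s, velocity_m_s):
    """Depth of a flat reflector: the wave travels down and back up,
    so the one-way distance is half the total path."""
    return velocity_m_s * two_way_time_s / 2.0

# e.g. an echo arriving after 2 s in rock where waves travel ~3000 m/s
print(reflector_depth(2.0, 3000.0))  # -> 3000.0 metres
```

Real surveys record thousands of such traces per shot and must solve for unknown, spatially varying velocities at the same time, which is where the heavy computation comes from.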
It was the development of the supercomputer that finally made it feasible to process the vast seismic data sets of petroleum exploration.
Beyond oil exploration, supercomputers also have wide applications in smart cities, personalized medicine and astrophysics. Experts have boldly predicted that, with the powerful and fast computing of supercomputers, subcritical nuclear tests could be carried out in the laboratory, meaning that supercomputing could one day completely replace physical nuclear testing.
Still, because early supercomputers were used mainly in national-level research projects, the public was unfamiliar with them and formed the impression that supercomputers had little to do with everyday life. In fact, as supercomputers have developed, most of the areas they reach are closely tied to people's livelihoods.
Data show that the Tianhe machines have brought billions in benefits to enterprises in automotive equipment, oil geophysical prospecting, animation rendering, biomedicine and related industries, with regional and industry-wide economic effects approaching tens of billions.
Take the weather forecast: people care about tomorrow's weather, but why does forecasting it need a supercomputer?
Experts at the Meteorological Bureau explained that current short- and medium-term weather forecasts are produced mainly from atmospheric data observed by meteorological satellites, by solving the dynamical equations that describe how weather evolves. The volume of data involved is enormous and the computation is complex; an ordinary computer might need several months to calculate tomorrow's weather, by which time the forecast would be meaningless.
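As a toy illustration of what "solving dynamical equations" means, the sketch below advects a single one-dimensional "weather blob" with a first-order upwind finite-difference scheme. All values here are made up for illustration; real forecast models solve coupled 3-D equations for wind, pressure, temperature and humidity at millions of grid points, which is precisely why they need a supercomputer.

```python
import numpy as np

# Toy 1-D advection equation du/dt + c*du/dx = 0, stepped forward with
# a first-order upwind scheme: a blob of "weather" drifts downstream.
nx = 200
dx = 1.0
c = 1.0        # constant wind speed (illustrative units)
dt = 0.5       # chosen so c*dt/dx <= 1 (CFL stability condition)
steps = 100

x = np.arange(nx) * dx
u = np.exp(-0.01 * (x - 50.0) ** 2)   # initial blob centred at x = 50

for _ in range(steps):
    # upwind update: each point looks at its upstream neighbour
    u[1:] = u[1:] - c * dt / dx * (u[1:] - u[:-1])
    u[0] = 0.0   # inflow boundary

peak = x[np.argmax(u)]
print(peak)   # blob has drifted ~ c * steps * dt = 50 units downstream
```

Even this toy needs 20,000 arithmetic updates; a real model multiplies that by millions of grid points, dozens of variables and thousands of time steps, every few hours, on a deadline.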
In other words, supercomputers can help us keep track of the weather around us in near real time, so we need no longer lament that the weather "changes like a child's face in June" or that there is "sunshine in the east while rain falls in the west".
3. Super Algorithms
"A supercomputer is like an abacus: without formulas, it is useless." For a supercomputer to really work, it needs a variety of large, complex formulas and algorithms, collectively called super-algorithm theory.
China is the home of the abacus. With the many computational formulas handed down by our ancestors, even today, in the age of the electronic computer, a skilled user with an abacus costing less than 10 yuan can add and subtract faster than someone with a calculator.
"A supercomputer is like an abacus: without formulas, it is useless," said Ahim, professor of mathematics and computational science at Sun Yat-sen University and director of the Guangdong Provincial Key Laboratory of Computational Science. For a supercomputer to work, it likewise takes a variety of formulas, only far larger and more complex. The scientific community currently calls this body of algorithms super-algorithm theory.
Ahim explained that super-algorithm theory studies how to design high-performance parallel computing systems and how to bring out the performance of supercomputers in the service of supercomputing science.
While supercomputers run ever faster, they also face a bottleneck: they keep growing in size, generating more heat and consuming more power.
"If it is merely faster, but the power draw and the cost are higher, it loses its point," said Liu Jun, general manager of high-performance computing at Inspur Group. The research direction for supercomputers now is to keep improving the performance-to-power ratio: to raise computing speed as much as possible within a rated power consumption.
Here the significance of the algorithm is self-evident. In layman's terms, the point of an algorithm is to find more efficient methods of computation that make the fullest use of a supercomputer's capacity, which is a real test of researchers' skill, stamina and fundamental knowledge.
A piece of news that shook the world at the beginning of 2010 confirms the power of algorithms. A French programmer, using an ordinary desktop computer worth about 2,000 euros, broke the world record for computing the digits of pi, a record previously set on the T2K Open supercomputer, then ranked 42nd in the world.
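The same lesson can be seen in miniature: given the same number of series terms, a better formula for pi gets enormously closer to the answer. The functions below are a didactic sketch of our own, not the record-setting method; actual pi records use far more sophisticated algorithms such as the Chudnovsky formula.

```python
import math

def pi_leibniz(terms):
    # Leibniz series: pi/4 = 1 - 1/3 + 1/5 - ...  (converges very slowly)
    return 4 * sum((-1) ** k / (2 * k + 1) for k in range(terms))

def arctan_series(x, terms):
    # Taylor series for arctan(x); accurate quickly when |x| is small
    return sum((-1) ** k * x ** (2 * k + 1) / (2 * k + 1) for k in range(terms))

def pi_machin(terms):
    # Machin's formula: pi/4 = 4*arctan(1/5) - arctan(1/239)
    return 4 * (4 * arctan_series(1 / 5, terms) - arctan_series(1 / 239, terms))

# Same budget of 10 terms, wildly different accuracy:
print(abs(pi_leibniz(10) - math.pi))   # error ~0.1: barely usable
print(abs(pi_machin(10) - math.pi))    # near machine precision
```

Ten terms of the naive series still miss pi in the first decimal place, while ten terms of Machin's formula are accurate to more than a dozen digits: the hardware is identical, only the algorithm changed.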
For this reason, alongside the rapid development of supercomputer hardware, the world is paying ever more attention to research on algorithms.
Liu Jun noted that although China has the world's fastest supercomputer in Tianhe-2, the country remains weak in applications: more than 90% of the software depends on foreign imports. As a result, China possesses extremely powerful large-scale computing systems but lacks matching large-scale parallel software; many large supercomputers have to be split into small clusters to run applications, so the large systems never deliver their full value. "This is an enormous waste."
"At present, not many of our application programs can run at the scale of trillions of operations per second; in other words, the utilization of our supercomputers is not high. Supercomputers must be used very carefully; used badly, they just burn money." Wang Endong, executive president of Inspur Group and director of the State Key Laboratory of High-Efficiency Server and Storage Technology, told the Nanfang Daily reporter that how to use supercomputers well is a global problem. To make the supercomputer truly an "engine" of scientific innovation and socio-economic development, he said, we must start with application innovation and talent training, and bring out the full computing potential of the hardware.
Half of the world's top 500 supercomputers are in the United States
From the birth in 1983 of China's first supercomputer, the "Milky Way" (Yinhe), capable of 100 million operations per second, to Tianhe-2 reclaiming the world's top spot in 2013 with a peak speed of 54.9 quadrillion floating-point operations per second, the Chinese have achieved a great overtaking leap in 30 years.
Yet despite Tianhe-2's breakthrough, in overall strength China still trails the United States, the leading supercomputing power, by no small gap.
According to People's Daily statistics from January 2014, the United States has 253 of the world's top 500 supercomputers, more than all other countries and regions combined. China ranks second with 65 systems on the list, Japan third with 30, while Britain, France and Germany rank fourth to sixth with 29, 23 and 19 respectively.