Financial Times: Weather, GDP, and the limitations of big data processing
Source: Internet
Author: User
Overseas Network, February 24 dispatch
The FT's February 24 article, entitled "The Limitations of Big Data Processing", reads:
It was raining in southern France on Monday.
But the day before had been sunny.
Wednesday was dry, but it rained almost continuously from Thursday to Saturday.
One small consolation is that the alternation of storms and sunny spells was accurately predicted. That is why I wrote this column on Monday rather than the day before. The accuracy of weather forecasts has improved greatly.
The BBC recently replayed the worst weather forecast in its history. In 1987, Michael Fish assured viewers on television that rumours of an approaching hurricane were unfounded. A few hours later, the worst winds in decades swept across the UK, tearing off roofs and felling trees.
Such a blunder is far less likely today. Short-term weather forecasting is a huge achievement of big data processing, perhaps the biggest. Supercomputers make such processing possible, and the size and complexity of the data sets they handle are astonishing. As far as I know, the latest supercomputers can handle about 1 EB (exabyte, 2^60 bytes) of data, roughly 20 million times the capacity of an Apple Mac. The UK's Met Office claims that its three-day forecasts are now as accurate as its one-day forecasts were in the Fish era, though that may not be the most persuasive way to convey how much forecasting has improved.
The fact remains, however, that the further ahead a forecast looks, the more its accuracy falls away. Forecasters can give us accurate predictions for the next two days; for longer horizons, they still cannot. The contrast between the two is stark. Consider, for example, the forecasts of unusual weather for this coming winter.
Short-term weather can be forecast because, in a sense, most of the factors that will determine tomorrow's weather are already present today. If you look up Fish's disastrous forecast on YouTube, you can see in his chart the area of ultra-low pressure that triggered the 1987 hurricane. The forecasters of the day simply misread the data in front of them, and better analytical capacity makes that kind of error less likely. Look further ahead, though, and you run into an intractable problem: in a nonlinear system, a tiny change in initial conditions produces ever larger changes in outcomes as time goes on. Over a long enough horizon, knowing almost everything about the initial conditions is little better than knowing nothing.
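The point about nonlinear systems can be sketched with a toy model. The snippet below uses the logistic map x_{n+1} = r·x_n·(1 − x_n) as a stand-in for the atmosphere; the map itself, the parameter r = 3.9, and the starting values are illustrative assumptions, not anything from the article.

```python
# Sensitive dependence on initial conditions in a simple nonlinear system:
# two trajectories of the logistic map that start one millionth apart.

def trajectory(x0, r=3.9, steps=50):
    """Iterate the logistic map from x0 and return every value visited."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = trajectory(0.200000)
b = trajectory(0.200001)          # initial condition perturbed by 1e-6
gaps = [abs(x - y) for x, y in zip(a, b)]

# Early on the two "forecasts" agree; eventually they bear no relation.
print(f"gap after 5 steps: {gaps[5]:.2e}")
print(f"largest gap seen:  {max(gaps):.2e}")
```

After a handful of steps the gap is still microscopic, which is why two-day forecasts work; over the full run the trajectories diverge to an order-one gap, which is why long-range forecasts do not.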
The same is true, to a large extent, in economics and business. Like tomorrow's rain or the 1987 hurricane, the answer to the question of tomorrow's gross domestic product is already, more or less, out there: tomorrow's products are already in production, tomorrow's goods are already on the shelves, tomorrow's deals have already been arranged. Big data processing helps us analyse this kind of information. With it, we can know GDP more accurately and more quickly, forecast next quarter's output more successfully, and revise our forecasts less often.
With big data processing, hedge fund managers may be able to predict what the Office for National Statistics will announce before the statisticians themselves know the numbers. That would be highly profitable for them but of little use to society. Big data gives them information as comprehensive as that held by the Bank of England's Monetary Policy Committee (MPC) when it sets interest rates. But it does not help them understand what the MPC will decide. Nor can it help them understand how US Treasury Secretary Hank Paulson and Lehman Brothers chief executive Dick Fuld would respond to the bank's impending bankruptcy.
Big data helps us understand the past and the present. How much it helps us understand the future, however, depends on the extent to which the future is contained in the present through some stable relationship. That relationship requires the mechanisms behind events to remain constant. For some physical processes, the assumption holds. For a world that contains Hitler and Napoleon, Henry Ford and Steve Jobs, it never does: in such a world, the processes that produce major decisions and discoveries are inherently unpredictable and cannot be quantified.
In such a world, a war can be lost for want of a nail, and small differences in how a problem is framed can lead to very different outcomes. For that world, too, the assumption fails. Still, with the help of big data processing, I know that the sun will shine again tomorrow.