How does big data change traditional disaster prevention and mitigation models?

Source: Internet
Author: User
Keywords: big data, disaster relief, disaster prevention
Tags: big data, data, information, information age

The 1976 Tangshan earthquake, the 2008 Wenchuan earthquake, and the 2010 Yushu earthquake remain etched in people's memory. In the face of earthquakes, floods, heat waves, torrential rain, and other natural disasters, how could people sit idle? Timely and effective response measures are essential, and the data that can be obtained is key to determining how effective relief measures are. As the foundation of the information age, data plays a role whose importance is self-evident.

As Chen Hua, director of the Institute of Modernization Strategy at the Guangdong Provincial Party School, recently said: "We must pay attention to big data and change the traditional disaster prevention and mitigation model, including changes in people's cognition, thinking, work, and lifestyle."

"When a major unconventional event occurs, the big data used to analyze, judge, and predict it determines how scientific, efficient, and rational the final decision will be. When this data is converted into knowledge, it plays a significant role in predicting major unconventional events," said Wensheng, deputy chief engineer at the Institute of Automation, Chinese Academy of Sciences (CAS). The characteristics of big data, such as its multiple sources, large volume, and real-time nature, can help the government predict the occurrence and development of disasters and decide the priority of relief efforts, and big data is playing an ever more important role in disaster prevention and mitigation.
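As a concrete illustration of the idea of fusing multi-source, real-time signals to decide relief priority, here is a minimal sketch in Python. It is not from the article: the data sources, field names, and weights are all hypothetical assumptions made for illustration.

```python
from dataclasses import dataclass

@dataclass
class AreaReport:
    """One area's signals, fused from multiple (hypothetical) sources."""
    name: str
    seismic_intensity: float    # e.g. from a sensor network, 0-12 scale
    population: int             # e.g. from census data
    social_media_mentions: int  # e.g. distress posts per hour
    road_access: float          # 0 (cut off) to 1 (fully open)

def relief_priority(report: AreaReport) -> float:
    """Toy priority score: higher means send help sooner.

    The weights below are illustrative assumptions, not a published model.
    """
    exposure = report.seismic_intensity * report.population
    urgency = 1.0 + report.social_media_mentions / 100.0
    difficulty = 2.0 - report.road_access  # harder access raises priority
    return exposure * urgency * difficulty

reports = [
    AreaReport("Town A", seismic_intensity=9.0, population=50_000,
               social_media_mentions=400, road_access=0.2),
    AreaReport("Town B", seismic_intensity=7.5, population=120_000,
               social_media_mentions=150, road_access=0.8),
]

# Dispatch teams to the highest-priority areas first.
for r in sorted(reports, key=relief_priority, reverse=True):
    print(f"{r.name}: priority={relief_priority(r):,.0f}")
```

In practice an agency would calibrate such a score against historical disasters rather than using fixed weights; the sketch only shows how heterogeneous signals can be combined into a single ranking.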

When data floods in, how to use and master it in service of disaster prevention and mitigation is an urgent problem. In Wensheng's view, defining big data by sheer volume alone, as merely a huge amount of data, is too one-sided. Big data is the inevitable outcome of social development and technological progress: its volume is large, its structure complex, and its real-time demands strong. Its processing principles and techniques inherit earlier research results in data mining and machine learning but change in essence, and its application models and influence differ from the past, marking a new era in which data becomes an important resource. Analyzing this vast data and putting it to work for us is the core of big data.

"Big data can improve the accuracy of disaster prediction and early warning," Jia Huaming said. Four hours after the Ya'an earthquake in Sichuan Province, the outside world could already obtain from various media the valuable aerial remote-sensing images screened out by big data, and use them to analyze the disaster situation and formulate rescue plans.

In the face of disaster, the situation for disaster prevention and mitigation is urgent, and the timeliness of early-warning information is extremely important. In 2011, people in New York City learned of an earthquake in the surrounding region in time through Twitter, reducing losses.
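One plausible mechanism behind such social-media early warning is keyword-burst detection over a stream of timestamped posts. The sketch below assumes this mechanism; the trigger words, window size, and alarm threshold are illustrative assumptions, not details from the article.

```python
from collections import deque
from datetime import datetime, timedelta

KEYWORDS = {"earthquake", "quake", "shaking"}  # assumed trigger words
WINDOW = timedelta(seconds=60)                 # assumed sliding window
THRESHOLD = 50                                 # assumed alarm level (hits per window)

recent_hits: deque[datetime] = deque()  # timestamps of keyword-matching posts

def process_post(timestamp: datetime, text: str) -> bool:
    """Return True if this post pushes the window over the alarm threshold."""
    if any(k in text.lower() for k in KEYWORDS):
        recent_hits.append(timestamp)
    # Drop hits that have fallen out of the sliding window.
    while recent_hits and timestamp - recent_hits[0] > WINDOW:
        recent_hits.popleft()
    return len(recent_hits) >= THRESHOLD
```

A production system would also weight posts by location and filter out figurative uses of the keywords; this sketch shows only the burst-detection core.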

Jia Huaming said: "In the early days of the Wenchuan earthquake relief effort, because disaster information was untimely, incomplete, and inaccurate, relief channels became congested and relief supplies could not be transported. Big-data-driven transmission of disaster information is highly efficient, which is essential for deploying disaster prevention and mitigation personnel sensibly and planning their actions." Massive amounts of information spread across network platforms, flowing between top and bottom and linking people together, and public coordination clears the congested links in disaster relief.

With information open and all kinds of valuable information circulating, everyone can become a participant in disaster prevention and mitigation. "This will effectively promote the efficient use of limited resources and avoid secondary losses and the waste of resources," Jia Huaming said. In addition, big data plays an increasingly important role in making disaster loss statistics effective: it can prevent collected information from being blocked and computing capacity from being wasted on repeated calculation.
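To make the point about repeated calculation concrete, here is a minimal sketch of deduplicating loss reports gathered through multiple channels before they are aggregated. The report fields and the choice of deduplication key are assumptions for illustration.

```python
# Minimal sketch: aggregate disaster-loss reports without double-counting.
# Field names and the dedup key (report_id, area, category) are assumptions.
reports = [
    {"report_id": "r1", "area": "Town A", "category": "housing", "loss": 1_200_000},
    {"report_id": "r1", "area": "Town A", "category": "housing", "loss": 1_200_000},  # same report via a second channel
    {"report_id": "r2", "area": "Town B", "category": "roads",   "loss": 300_000},
]

seen: set[tuple[str, str, str]] = set()
totals: dict[str, int] = {}

for r in reports:
    key = (r["report_id"], r["area"], r["category"])
    if key in seen:
        continue  # duplicate submission; count the loss only once
    seen.add(key)
    totals[r["area"]] = totals.get(r["area"], 0) + r["loss"]

print(totals)  # {'Town A': 1200000, 'Town B': 300000}
```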

The era of the information explosion has arrived. Wensheng said: "The key problem we now face is that big data is hard to obtain. There is an embarrassing phenomenon: we can see big data but cannot catch it, and when we can catch it, we cannot process it."

(Source: "China Meteorological News" June 17, 2014 Three edition of the Executive Editor: Tang Yu)

(Responsible editor: Mengyishan)
