Big data triggers big changes in the media industry

Source: Internet
Author: User
Keywords: big data

In 2002, the science fiction film Minority Report depicted the United States of 2054, where murder had disappeared because crime could be predicted. Three people with the ability to perceive the future, the precogs, obtain information about a crime beforehand, and after the Justice Department's crime-prevention team has deciphered the evidence, the offender is punished before the crime is committed. All of this now seems to be becoming reality, except that the movie relied on superpowers, while today we use big data.

In the first few weeks of the H1N1 outbreak in 2009, Google predicted the spread of influenza by observing people's online search records. Google has kept its search records for years, receiving more than 3 billion search queries a day from around the world, and it reached this conclusion by analyzing that huge data source.

In May 2011, McKinsey, the world-renowned consultancy, released the report "Big data: The next frontier for innovation, competition, and productivity," which kicked off the study of big data. As the first thematic study to interpret the potential of big data from economic and business dimensions, the report systematically expounds the concept of big data, enumerates its key technologies, analyzes its applications across industries, and puts forward strategies for government and enterprise decision-makers to respond to the development of big data.

Introduction to big data

McKinsey argues that "big data" refers to datasets whose size exceeds the capability of typical database software tools to capture, store, manage, and analyze. This definition has two implications: first, the datasets that qualify as big data vary in size and will grow with time and technology; second, the dataset size that counts as big data differs across sectors. At present, big data generally ranges from several terabytes (TB) to several petabytes (PB).

Big data, but how big is the data? Many people have tried to put an exact number on it. Martin Hilbert of the University of Southern California tried to calculate the total amount of information that human beings have created, stored, and disseminated. According to his estimate, in 2007 humans stored about 300 exabytes of data [1]. By 2013, he predicted, the data stored in the world could reach about 1.2 zettabytes. This means that if all this data were printed in books, the books could cover the entire United States 52 times over; if it were stored on CD-ROMs, the discs could be stacked into five piles, each reaching to the Moon [2].
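A quick back-of-the-envelope sketch (not from the article) puts these two estimates in the same units and shows the implied growth. The figures and SI prefixes (1 EB = 10^18 bytes, 1 ZB = 10^21 bytes) are the only inputs:

```python
# Decimal (SI) unit sizes in bytes.
EB = 10**18  # exabyte
ZB = 10**21  # zettabyte

stored_2007 = 300 * EB   # ~300 exabytes stored worldwide in 2007
stored_2013 = 1.2 * ZB   # ~1.2 zettabytes predicted for 2013

# Implied growth over six years: roughly a fourfold increase.
growth = stored_2013 / stored_2007
print(f"2007 -> 2013 growth: {growth:.1f}x")
```

In other words, the prediction amounts to global stored data quadrupling in about six years.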

IBM summed up the characteristics of big data as the "4 Vs":

(1) Volume

The volume of data is huge. When we process data, we no longer need random samples; we can use the entire dataset. That is, the sample equals the whole population. In the small-data era we relied on random sampling to obtain the most information from the least data, because the limitations of technology made collecting too many samples costly and laborious. Now that the technological environment has greatly improved, clinging to sampling analysis is like riding a horse-drawn carriage in the age of the automobile. Big data is destined to have its greatest impact on the social sciences, as we no longer need to rely on sample surveys.

(2) Variety

The types and sources of data are numerous, and they include more and more unstructured data (such as images and sound). Big data has several main sources [3]:

First, media data, especially data generated on the Internet and social media, including the digital records of people's web browsing;

Second, data from enterprises' production, sales, management, and other operations;

Third, data from government departments;

Fourth, Internet of Things data produced by various sensors, as well as footage from the many cameras connected to the Internet;

Fifth, data retained by individuals, including personal and family text, audio, and video records.

With so many types and sources of data mixed together, the direct result is more erroneous data and lower precision. In the small-data era, a sample survey made us responsible for the reliability and validity of the investigation, so the more accurate the data, the better. That is no longer the case. In the big-data era we have increasingly comprehensive data, which contains not just a small slice of a phenomenon but a large amount of, or even all, the data about it. What we need to do is accept the messiness of data and benefit from it, rather than try to eliminate all uncertainty at great cost.

(3) Velocity

Data grows rapidly, and the demand for real-time processing speed is correspondingly high.

(4) Value

The overall value of the data is high, but its value density is low: the truly valuable data hidden within the massive dataset is only a small fraction of the whole.

