The "Big Data" Era Is Coming: CIOs, Can You Handle It?

Source: Internet
Author: User
Keywords: big data, massive data, three perspectives

According to market research firm IDC, the total amount of digital information is expected to increase 44-fold between 2009 and 2020, with global data volume reaching about 35.2 ZB (1 ZB = 1 billion TB) over the next 10 years. At the same time, the size of individual datasets will grow, requiring greater processing power to analyze and understand them.
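The projection above implies a 2009 baseline that the text does not state. A quick back-of-envelope check, using only the figures quoted from IDC (the 2009 value below is derived, not from the source):

```python
# Back-of-envelope check of the IDC projection quoted above.
# Stated: ~35.2 ZB by 2020, a 44-fold increase from 2009; 1 ZB = 1 billion TB.
ZB_IN_TB = 1_000_000_000  # 1 zettabyte = 10^9 terabytes

total_2020_zb = 35.2
growth_factor = 44

baseline_2009_zb = total_2020_zb / growth_factor  # implied 2009 volume
total_2020_tb = total_2020_zb * ZB_IN_TB

print(f"Implied 2009 volume: {baseline_2009_zb:.1f} ZB")  # 0.8 ZB
print(f"2020 volume in TB:   {total_2020_tb:.2e} TB")     # 3.52e+10 TB
```

So the quoted numbers are internally consistent: a 44-fold increase from roughly 0.8 ZB in 2009 yields the projected 35.2 ZB.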

A Unisphere Research survey of 531 independent Oracle users found that data volumes are rising rapidly at 90% of businesses, with 16% of them growing at 50% or more annually. Many companies have felt the impact of runaway data growth on performance: 87% of respondents blamed their companies' application performance problems on growing data volumes.
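To put the survey's 50% annual growth figure in perspective, compound growth at that rate roughly doubles a data set every 21 months. This is a derived illustration, not a figure from the survey:

```python
import math

annual_growth = 0.50  # 50% per year, as reported for 16% of respondents

# Years for data volume to double at compound 50% annual growth
doubling_time_years = math.log(2) / math.log(1 + annual_growth)
print(f"Doubling time: {doubling_time_years:.2f} years")  # ~1.71 years

# Volume multiplier after 5 years at the same rate
print(f"5-year growth: {(1 + annual_growth) ** 5:.1f}x")  # ~7.6x
```

At that pace, a company's data grows more than sevenfold in five years, which is why respondents tie performance problems directly to data growth.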

Why are people so interested in big data? Big data is a groundbreaking economic and technological force that introduces new infrastructure to IT. Big data solutions remove traditional computing and storage limitations. With growing volumes of private and public data, an epoch-making new business model is emerging, one that promises substantial new revenue growth and competitive advantages for big data customers.

Big Data: A Duel of Definitions

Although "big data" can be rendered as either "large data" or "massive data," there is a difference between the two. Dan Bin, Informatica's chief product consultant for China, said: "Big data" encompasses the meaning of "massive data" and goes beyond it; in short, big data is massive data plus complex data types. Big data sets, including transactional and interactive datasets, are larger or more complex than common technologies can capture, manage, and process at reasonable cost and within reasonable time. Big data is composed of three major technology trends: massive transactional data, massive interactive data, and massive data processing.

"There is no uniform definition of big data," said Ye Chenghui, senior global vice president and president of Greater China, in an interview. It is generally regarded as a mass of unstructured data, characterized by large volume and diverse forms.

NetApp's Greater China general manager, Chen, understands big data in terms of three "ABC" elements: big analytics (Analytics), high bandwidth (Bandwidth), and big content (Content).

IBM uses three "V"s as the basis for big data; data that satisfies at least two of them counts as big data: Variety, Volume, and Velocity. Variety means the data includes both structured and unstructured data. Volume means the amount of data aggregated for analysis is very large. Velocity means the data must be processed quickly.

For large companies, the rise of big data is partly because computing power has become available at lower cost and systems are now capable of multitasking. Second, the cost of memory has plummeted, so businesses can handle more data in memory than ever before. Third, it is getting easier to aggregate computers into server clusters. Carl Olofson, a database management analyst at IDC, believes the combination of these three factors has spawned big data.

Olofson says big data "is not always measured in hundreds of terabytes. Depending on actual usage, sometimes a few hundred gigabytes can also count as big data; that depends mainly on the third dimension, speed, or the time dimension. If I can analyze 300 GB of data in 1 second when it would normally take 1 hour, the result of that enormous change adds great value. Big data technology is an affordable application that meets at least two of these three criteria."
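Olofson's example can be made concrete: moving a 300 GB analysis from 1 hour to 1 second is a 3600x speedup, and the two regimes imply very different throughput requirements. The numbers below are derived from the figures he quotes:

```python
data_gb = 300
slow_seconds = 3600   # 1 hour, the "usual" analysis time
fast_seconds = 1      # the big-data target

speedup = slow_seconds / fast_seconds
slow_throughput = data_gb / slow_seconds  # GB/s at the old speed
fast_throughput = data_gb / fast_seconds  # GB/s at the new speed

print(f"Speedup:        {speedup:.0f}x")              # 3600x
print(f"Old throughput: {slow_throughput:.3f} GB/s")  # 0.083 GB/s
print(f"New throughput: {fast_throughput:.0f} GB/s")  # 300 GB/s
```

Sustaining 300 GB/s is far beyond a single disk or machine, which is why the speed dimension pushes organizations toward in-memory processing and server clusters, as the previous paragraph describes.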

Big data means doing things differently by getting information more quickly, and thereby achieving breakthroughs. Big data is defined as a large amount of data (usually unstructured) that requires us to rethink how we store, manage, and retrieve data. So how big is big? One way to think about it is that the data is so large that none of the tools we use today can handle it, so the key challenge is how to digest the data and transform it into valuable insight and information.

In short, big data has attracted the attention of IT vendors for two main reasons: first, the data structures are complex, making data mining difficult; second, the data volumes are huge and change quickly, making timely processing especially demanding.


(Responsible editor: The good of the Legacy)
