Where does management in the big data era begin?
The explosion in data volume is a development of just the last two years. The social changes brought by Internet applications mean that much data now originates from the client side rather than from inside the enterprise, and the volume is growing at an accelerating rate. Of this data, 70%-85% is "a mixture of multiple data formats," so the future data management model will differ greatly from today's. In addition, 87% of database performance problems are related to growth in data volume, according to a survey of Oracle deployments, and Gartner finds that data volume directly affects the performance of existing processing patterns. If data keeps being lumped together and stored under the original management model, more challenges lie ahead: existing database structures and data management models simply cannot handle data at big data scale.
Companies that adopt big data technology early can gain an edge in the competition. One of Gartner's predictions for the next five years caused a sensation: by 2015, 85% of the world's top 500 companies will lose their competitive edge if they fail to take up big data. Big data competition is therefore both time-critical and unforgiving, and enterprises need a sufficiently strong response now.
What is "big data", and what problems does it pose?
"Big Data" refers not only to the amount of data, but to a series of new challenges. The concept of Bigdata was first proposed by a Gartner analyst called Golary. He proposes that Bigdata faces three V Challenges: Data Volume (Volume), Data diversity (produced), and high speed (velocity).
On this premise, Gartner last year released a 12-quadrant model of big data. The part that draws the most attention is the bottom tier, which covers big data's first quantitative indicators: data volume, data type, and processing speed. What the typical enterprise manages is data in databases: structured data, handled by pre-installed management software. Big data management often concerns data we have not been able to manage, such as data from outside the enterprise: microblogs, social networking sites, multimedia, and other carriers.
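As a rough illustration, the sketch below (all field names hypothetical) shows the gap between the two worlds: a nested, variably shaped social-media record has to be flattened before a conventional relational database can manage it.

```python
# A minimal sketch (illustrative only): normalizing a semi-structured
# social-media record into the flat, structured form a relational
# database expects. All field names here are hypothetical.
import json

raw_post = '''
{
  "user": {"id": 42, "name": "alice"},
  "text": "Flooding reported downtown",
  "media": ["photo1.jpg"],
  "posted_at": "2012-06-01T08:30:00Z"
}
'''

def flatten(post: dict) -> dict:
    """Map a nested, variably shaped post onto fixed relational columns."""
    return {
        "user_id": post["user"]["id"],
        "user_name": post["user"]["name"],
        "text": post.get("text", ""),
        "media_count": len(post.get("media", [])),
        "posted_at": post["posted_at"],
    }

row = flatten(json.loads(raw_post))
print(row)  # now shaped for an INSERT into a structured table
```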
Data variety will be a key aspect of big data: future data will be generated in greatly differing sources and forms. High speed, meanwhile, is not the same thing as the system performance CIOs usually watch. Here it refers to the speed from data generation to the final decision made on that data, which covers storage, computation, the whole model, and the way the final results are presented. So the problem is not only computing power and storage performance; it is how to guarantee the processing speed of the entire data management pipeline. In big data problems, speed can be a matter of life and death. Take disaster prediction: when a disaster occurs, its severity, the regions affected, and its long-term impact all need to be quantified. This is a typical big data application; if the results cannot be computed within a short time, the data is useless.
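The deadline aspect can be made concrete with a small sketch. Assuming a hypothetical two-second decision budget and a stand-in impact model, any result that misses the deadline is simply discarded as worthless:

```python
# A minimal sketch (illustrative only) of the "velocity" constraint:
# each event must travel from arrival to decision within a fixed
# latency budget, or its result is discarded as stale.
import time
from typing import Optional

LATENCY_BUDGET_S = 2.0  # hypothetical deadline from event to decision

def assess_impact(event: dict) -> str:
    """Stand-in for the real model that quantifies disaster severity."""
    return "severe" if event["magnitude"] > 6.0 else "moderate"

def decide(event: dict) -> Optional[str]:
    start = time.monotonic()
    decision = assess_impact(event)
    elapsed = time.monotonic() - start
    if elapsed > LATENCY_BUDGET_S:
        return None  # result arrived too late to be useful
    return decision

print(decide({"magnitude": 6.8, "region": "coastal"}))  # "severe"
```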
Complexity compounds volume, variety, and velocity. So far it has been hard to find a single good solution that handles every data format problem. Although many industries are beginning to adopt standardization measures to sidestep the issue, it remains a serious one.
The medical world now has a data standard, DICOM, a protocol for storing and transmitting medical images. It began as a standard for medical imaging equipment, but now the medical profession at large, including hospitals, uses the format, which is a good trend: a neutral organization develops the data format standard and thereby resolves some of the data complexity problem. But on a larger scale, across all businesses, no such common data format exists. Even though XML is now in relatively wide use for this purpose, many challenges remain in specific definitions and applications.
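This is what a neutral standard buys: any compliant tool can read the same tagged fields from any vendor's file. A minimal sketch, assuming the third-party pydicom library is installed (pip install pydicom) and a hypothetical file scan.dcm exists:

```python
# A minimal sketch of reading a DICOM file with pydicom; the file
# path is hypothetical, and pixel_array additionally requires NumPy.
import pydicom

ds = pydicom.dcmread("scan.dcm")   # parse a DICOM file
print(ds.Modality)                 # e.g. "CT" or "MR"
print(ds.PatientID)                # standardized patient tag
print(ds.pixel_array.shape)        # image data as a NumPy array
```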
Information management pushed to the limit: 12 quadrants
"Big Data" will push the limits of the needs of all aspects of information management. Access Rights management and control, including data sensitivity ratings (classification), shared Protocol (contracts), hotspot data (pervasiveness), technology implementation (Technologies). This level is rarely mentioned by users, with such a large amount of data, the future will be a very serious problem. Data sensitivity rating, putting all the data together is the disaster of data management, the premise of data management is that all data produce different value, different period of value is not the same, must define what is valuable, which is not valuable, but also to define the time limit of value. Sharing protocols, how data is submitted, how it is submitted, and how it is submitted, need to be determined by law in the form of a contract. Hot data, large data times hot data is changing. The degree of hotspot and the time ahead are important for access and control. Technology implementation, the ability to manage large data is what technical means.
Above access rights sits quality management, an important concept carried over from the traditional data warehouse. It comprises fidelity (Fidelity), data linking (Linking), data validity (Validation), and data perishability (Perishability). Fidelity: when data is brought in, the context of each datum must be preserved so that it is not misapplied in the next scenario. Linking: results from a model built on data combined from different sources must not be taken out of context. Validity: data must be managed from the standpoint of time and application scenario, a higher level of management. Perishability: data loses value over time and must eventually be retired. These four quadrants are very important, yet today attention still concentrates on the data volume problem alone.
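A minimal sketch of how these quadrants might look in practice, with hypothetical names: each datum carries its source and context (fidelity, linking) and an expiry time (perishability), and validation checks both before the datum is used:

```python
# A minimal sketch (names hypothetical) of the quality quadrants.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Datum:
    value: float
    source: str          # provenance, so origin is never lost
    context: str         # the scenario the value was measured in
    expires: datetime    # after this, the datum has perished

def validate(d: Datum, scenario: str, now: datetime) -> bool:
    """A datum is usable only in its own context and before expiry."""
    return d.context == scenario and now < d.expires

d = Datum(3.2, "sensor-17", "flood-gauge", datetime(2012, 6, 1))
print(validate(d, "flood-gauge", datetime(2012, 5, 1)))  # True
print(validate(d, "stock-model", datetime(2012, 5, 1)))  # False: out of context
```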
Big data generates new requirements for data center architecture design
Existing data centers built on relational architectures will find it hard to meet future requirements. Fully exploited, big data offers great business opportunities. Take business forecasting: the 2008 financial crisis was blamed in part on inadequate regulation and serious flaws in the design of data models. Had a more sophisticated data model been devised, could the crisis have been predicted, its consequences quantified, and better decisions made? Big data matters not only to businesses but to a country, even to global stability.
There are also many new kinds of applications, such as futures and stock trading: predicting a market trend even slightly earlier than a competitor carries great business value. These challenges and opportunities create a new need for a big data architecture. The complex models of big data cannot be built by extending the original systems or bolting on a few new applications; gradually expanding existing data centers, for example, can hardly satisfy this demand. Whether in databases, storage, or computing capability, today's mainstream data center technology struggles to meet big data's needs, so a revolutionary change to the entire IT architecture has to be considered.