Tenkine Server Channel, April 23 — The broad application of big data is beyond doubt. Corporate CEOs and CIOs talk about big data when they discuss business strategy and IT model innovation, and big data is driving rapid change in infrastructure. Gartner's report "Big Data Drives Rapid Changes in Infrastructure and $232 Billion in IT Spending Through 2016" points out that big-data-driven IT spending is expected to reach $34 billion in 2013, and that total global spending on big data will reach $232 billion by 2016. Gartner stresses that big data analysis tools will be considered an essential investment in 2014.
How to make full use of big data and mine its commercial value, thereby enhancing enterprise competitiveness, has become a focus of attention. This is also the direction of Oracle's efforts.
Sishing, vice president of Oracle Corporation and general manager of technology for the Greater China region
Only a comprehensive solution will work
At present, more and more enterprises treat the results of big data analysis as the basis for judging future development, and traditional business-prediction logic is increasingly being replaced by new big data forecasts. However, expectations for big data should be managed carefully: massive data can deliver business value only if it is effectively managed.
The most widely known definition of big data is Gartner's "3V" characterization: huge data volumes (Volume), fast data processing (Velocity), and variable data structures and types (Variety). This definition first calls to mind the unstructured data that IT systems have long found difficult to handle but cannot ignore. In other words, big data does not only involve the analysis of transactional data; information from social media, e-commerce, decision support and other sources must also be integrated. Distributed processing technologies such as Hadoop and NoSQL can now store, process, analyze and mine unstructured data, but by themselves they do not provide a comprehensive solution that meets customers' big data requirements.
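The MapReduce model that underlies Hadoop can be illustrated with a minimal single-process sketch (plain Python for illustration, not Hadoop's actual API): map each record into key-value pairs, shuffle the pairs by key, then reduce each group.

```python
from collections import defaultdict

def map_phase(records):
    """Map step: emit a (word, 1) pair for every word in every record."""
    for record in records:
        for word in record.split():
            yield word.lower(), 1

def reduce_phase(pairs):
    """Shuffle pairs by key, then reduce each group by summing counts."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return {key: sum(values) for key, values in groups.items()}

# Example: counting words across unstructured text records
records = ["big data drives change", "big data needs management"]
counts = reduce_phase(map_phase(records))
```

In a real Hadoop cluster the map and reduce steps run in parallel across many nodes and the shuffle moves data over the network; the toy version above only shows the shape of the computation.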
In fact, big data in the general sense is broader: any complex computation involving massive data and multiple data sources falls into the big data category, not just unstructured data. Telecommunications operators, for example, hold many kinds of detailed data on a large number of users: switch records, network registration information, billing information, detailed mobile Internet logs, roaming records, subscribed-service information and basic service information. All of it can be classified as big data.
Compared with the rise of cloud computing a few years ago, the path by which big data realizes its business value may be longer. But business users are impatient, and more and more executives want to use big data analysis as an important basis for business decisions. In this context, a comprehensive big data solution is needed: one that not only solves unstructured data processing, but also extends to massive data storage, distributed collection and exchange of big data, real-time fast access to massive data, statistical analysis and mining, and business intelligence analysis.
A typical big data solution should therefore be a versatile platform that covers the storage, computation, analysis and mining of structured data; the storage and processing of multi-structured data; and business intelligence analysis over big data. Such a solution should have four characteristics: integrated hardware and software for big data processing, the ability to process data of all structures, large-scale in-memory computing capacity, and ultra-high-speed network access.
Hardware and software integration is the inevitable choice
We believe the key to a big data solution is how it handles large-scale data computation. The traditional architecture of front-end database servers plus back-end bulk storage has had difficulty storing large-scale data efficiently while maintaining high-performance processing. The answer is to make software and hardware integrate more effectively and collaborate more closely; in other words, hardware and software together must meet the challenge of big data.
Oracle has always held an absolute advantage in traditional relational databases, but it has not been complacent. Facing the big data boom and users' constantly evolving needs, Oracle has expanded from traditional relational databases to comprehensive big data solutions, becoming the first company in the industry to meet enterprises' critical data needs through comprehensive, hardware-software-integrated products.
Through this integration of software and hardware, Oracle provides big data capture, organization, analysis and decision-making capabilities in a complete, integrated big data solution. Its core products include the Oracle Big Data Appliance, the Oracle Exalytics Business Intelligence Cloud Server and the Oracle Exadata Database Cloud Server.
The Oracle Big Data Appliance is used for processing multi-structured data and is designed to simplify the implementation and management of big data projects; its processing results can be passed to the Oracle Exadata Database Cloud Server over an ultra-high-bandwidth InfiniBand network. Oracle Exadata provides efficient data storage and computing capabilities: with very large memory, fast flash storage and unique hardware-software optimization, it can efficiently process, analyze and mine big data. At the same time, Oracle provides efficient, convenient advanced analysis software at the Exadata and database-software levels to make data analysis, mining and processing faster and more effective.
Once big data has been rapidly acquired and organized by the Oracle Big Data Appliance, enterprises can make scientific business decisions based on comprehensive, real-time analysis of that data. The Oracle Exalytics Business Intelligence Cloud Server runs data analysis applications at unprecedented speed, giving customers real-time, fast visual analysis. It, too, connects to Oracle Exadata over the InfiniBand network for data loading and reading, so that big data can be computed directly in memory to meet the demand for rapid analytical response in the big data age. Oracle Exalytics supports a new class of analytical applications that can run in heterogeneous IT environments, accessing and analyzing data from any Oracle or non-Oracle relational, OLAP or unstructured data source.
Together, the Oracle Big Data Appliance, the Oracle Exalytics Business Intelligence Cloud Server and the Oracle Exadata Database Cloud Server constitute Oracle's broadest, most highly integrated system product portfolio, providing enterprises with an end-to-end big data solution that meets their big data governance needs.
Adhering to an open strategy
As things stand, in big data applications no single vendor's products can solve every problem. For big data solution providers, adopting an open strategy is therefore the inevitable choice. Oracle adheres to a comprehensive, open and integrated product strategy, and this strategy applies in the big data arena as well.
This is first reflected in a big data strategy that technically supports Hadoop and open-source software. In addition to its integrated offerings, Oracle has a range of leading technologies to help users fully address the challenges of big data applications, including the Oracle NoSQL Database and products for the Hadoop architecture.
The Oracle NoSQL Database is designed to manage massive amounts of data; it helps businesses access unstructured data and scales horizontally to hundreds of high-availability nodes. The product also delivers predictable throughput and latency and is easy to install, configure and manage, supporting a wide range of workloads.
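The horizontal scaling described above rests on partitioning: each key is hashed to one of the nodes, so adding nodes spreads both data and load. The toy Python class below sketches that idea only; it is not Oracle NoSQL Database's API, and real systems add replication and rebalancing on top.

```python
import hashlib

class PartitionedStore:
    """Toy key-value store that spreads keys across N nodes by hashing.

    Illustrative only: each "node" is just an in-process dict; a real
    distributed store replicates partitions and rebalances them as
    nodes join or leave.
    """

    def __init__(self, num_nodes):
        self.nodes = [dict() for _ in range(num_nodes)]

    def _node_for(self, key):
        # Hash the key and map it deterministically onto one node.
        digest = hashlib.md5(key.encode()).hexdigest()
        return int(digest, 16) % len(self.nodes)

    def put(self, key, value):
        self.nodes[self._node_for(key)][key] = value

    def get(self, key):
        return self.nodes[self._node_for(key)].get(key)

# Example: a telecom-style user record lands on exactly one node.
store = PartitionedStore(num_nodes=4)
store.put("user:42", {"plan": "mobile", "roaming": True})
value = store.get("user:42")
```

Because reads hash the key the same way writes do, any client can locate a record without a central index, which is what makes scaling to hundreds of nodes practical.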
Products dedicated to the Hadoop architecture help organizations meet the challenges of organizing and extracting big data; they include the Oracle Data Integrator Application Adapter for Hadoop, Oracle Loader for Hadoop and Oracle SQL Connector for HDFS.
In addition, Oracle R Enterprise integrates the R open-source statistics environment with Oracle Database 11g, providing an enterprise-ready, deeply integrated environment for advanced data analysis.
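The point of such in-database integration is that the computation moves to the data rather than the data to the client. As a minimal stand-in sketch (Python's built-in sqlite3 substituting for Oracle Database, not Oracle R Enterprise's actual interface), the aggregation below runs inside the SQL engine and only the summarized rows come back:

```python
import sqlite3

# Toy illustration of in-database analytics: the SUM/GROUP BY runs
# inside the database engine, so only one row per user crosses the
# boundary to the client instead of every detail record.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE calls (user_id INTEGER, minutes REAL)")
conn.executemany(
    "INSERT INTO calls VALUES (?, ?)",
    [(1, 12.5), (1, 3.0), (2, 40.0)],
)

rows = conn.execute(
    "SELECT user_id, SUM(minutes) FROM calls GROUP BY user_id"
).fetchall()
totals = dict(rows)  # {user_id: total_minutes}
```

With millions of call records, pushing the aggregation down like this avoids shipping the raw data to the analysis tool, which is the same design motivation behind running R computations inside the database.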
It is worth mentioning that, beyond continuous investment in products and solutions, Oracle is committed to developing big data solutions with partners. Almost all Oracle partners are currently following and testing big data solutions, and Oracle is actively seeking more local partners to provide customers with more customized products and solutions.
All in all, big data, together with cloud computing, social and mobile, has become an important factor driving change in the enterprise IT model. Oracle's big data solutions can be integrated with other products across all levels of the IT architecture and offer superior reliability, scalability and manageability, providing an ideal IT foundation for enterprise IT development and, ultimately, business development.
(Author: Oracle's Sishing; Editor: Xu Ming)