Analysis of the 2014 Big Data Market Forecast

Source: Internet
Author: User
Keywords: big data, big data market, forecast

According to IDC, a market-research firm, the big data market will grow from $3.2 billion in 2010 to $17 billion in 2015, a compound annual growth rate of about 40%. Big data is a vast new field in which datasets grow so large that they become difficult to handle with traditional database-management tools. The new tools, frameworks, hardware, software, and services needed to address this problem represent a huge market opportunity. As enterprise users increasingly need continuous access to data, a good big data toolset offers scalable, high-performance analysis at low cost and near-real-time speed. By analyzing this data, enterprises can gain greater intelligence and competitive advantage. What follows is a set of forecasts for the 2014 big data market from John Schroeder, co-founder and CEO of MapR, a Hadoop distribution vendor.

1. SQL has the greatest potential in big data

The development of SQL on Hadoop lets business analysts apply their existing skills, and the SQL tools they already prefer, to big data projects. Developers can choose among Apache projects such as Hive, Drill, and Impala, or proprietary technologies from companies such as Hadapt, HAWQ, and Splice Machine.
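The point of these engines is that an ordinary analyst query runs unchanged against Hadoop-resident data. As a minimal sketch, the query below uses Python's built-in sqlite3 as a stand-in for a SQL-on-Hadoop engine; the `clicks` table and its contents are hypothetical, but the `GROUP BY` aggregate is exactly the kind of statement Hive or Impala would accept.

```python
import sqlite3

# Hypothetical clickstream table standing in for a Hadoop-resident dataset.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE clicks (user_id TEXT, page TEXT, visits INTEGER)")
conn.executemany("INSERT INTO clicks VALUES (?, ?, ?)", [
    ("u1", "home", 3), ("u2", "checkout", 1), ("u1", "checkout", 1),
])

# The kind of aggregate a business analyst would run, unchanged,
# on a SQL-on-Hadoop engine.
rows = conn.execute(
    "SELECT page, SUM(visits) AS total FROM clicks "
    "GROUP BY page ORDER BY total DESC"
).fetchall()
print(rows)  # [('home', 3), ('checkout', 2)]
```

The SQL itself is ANSI-style and engine-agnostic, which is precisely why analysts can reuse their existing tools.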

2. Even so, SQL faces challenges

SQL requires structured data, and centralizing and structuring data introduces delays and manual administration. SQL also restricts the types of analysis that can be performed. Over-reliance on SQL will therefore delay an organization's efforts to extract full value from its data and slow its response.

3. Authentication is the major big data security issue

Given the weakness of the access-control capabilities provided in Hadoop, organizations are quickly realizing that wire-level authentication is a necessary foundation. Without adequate authentication, any more advanced controls can easily be bypassed, undermining planned security measures.
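The "bypassed without authentication" point can be made concrete with a minimal sketch: if every request on the wire carries a message-authentication code computed from a shared secret, a client that does not hold the secret cannot forge a request, so higher-level access controls actually apply. The shared secret and request strings below are hypothetical.

```python
import hashlib
import hmac

# Hypothetical cluster-wide shared secret; real deployments would use
# Kerberos or per-node keys rather than a single hard-coded value.
SECRET = b"cluster-shared-secret"

def sign(message: bytes) -> str:
    """Compute an HMAC-SHA256 tag for a request on the wire."""
    return hmac.new(SECRET, message, hashlib.sha256).hexdigest()

def verify(message: bytes, tag: str) -> bool:
    """Accept the request only if its tag matches (constant-time compare)."""
    return hmac.compare_digest(sign(message), tag)

request = b"READ /warehouse/sales/part-0001"
tag = sign(request)
print(verify(request, tag))              # True: authenticated request
print(verify(b"READ /etc/passwd", tag))  # False: forged request rejected
```

The design point is that access-control decisions are only meaningful once the requester's identity is cryptographically bound to each message.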

4. Data errors become learning opportunities

Organizations will encounter many data errors in 2014. Does a data error indicate a problem in the underlying source system? Is it a deviation surfaced by downstream analysis? Does it reveal definitional differences or a lack of consistency across departments and business units? 2014 will see organizations confronting and resolving such data anomalies.

5. Hadoop proves itself in production

2014 will see a significant increase in production deployments of Hadoop across industries, demonstrating Hadoop's operational power. In these deployments, production applications and analytics combine to deliver measurable business advantages, as in personalized retail recommendations, fraud detection, and predictive maintenance based on sensor data.

6. More enterprises will deploy data hubs alongside their data warehouses

A data hub offloads data-transformation processing, and the data itself, from the enterprise data warehouse to Hadoop. As a central hub for the business's data, Hadoop costs roughly one-tenth as much, freeing capacity for additional processing or entirely new applications.

7. New data-centric applications will be mandatory

The ability to use big data will become a competitive weapon in 2014. More companies will use big data and Hadoop to target individual consumer preferences precisely, pursuing lucrative upselling and cross-selling opportunities, better mitigating risk, and reducing production and overhead costs.

8. Data becomes the core of the data center

Organizations will transition from developer-led experiments to enterprise-wide big data plans. IT departments will increasingly take on the task of defining a data infrastructure that supports multiple applications, focusing on what is needed to deploy, process, and protect an organization's core data assets.

9. Search will become the unstructured query language

2013 produced a large number of SQL-on-Hadoop initiatives; 2014 will be the year the spotlight shifts to search as the unstructured query language. Integrating search into Hadoop gives enterprise users a simple and intuitive way to find important information. Search engines are also at the heart of many discovery and analytics applications, including recommendation engines.
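The core data structure behind those search engines is an inverted index: a map from each term to the set of documents containing it, so that a query becomes a set intersection rather than a scan. A minimal sketch, using a hypothetical three-document corpus:

```python
from collections import defaultdict

# Hypothetical corpus standing in for documents stored on a cluster.
docs = {
    1: "fraud detected in retail transactions",
    2: "retail recommendations drive sales",
    3: "sensor data predicts maintenance",
}

# Build the inverted index: term -> set of document ids.
index = defaultdict(set)
for doc_id, text in docs.items():
    for term in text.split():
        index[term].add(doc_id)

def search(query: str) -> set:
    """Return ids of documents containing every term in the query."""
    term_sets = [index.get(term, set()) for term in query.split()]
    return set.intersection(*term_sets) if term_sets else set()

print(search("retail"))        # {1, 2}
print(search("retail sales"))  # {2}
```

Production systems such as Solr or Elasticsearch add tokenization, ranking, and distribution on top, but the lookup-and-intersect core is the same, which is why search scales well over unstructured data.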

10. Hadoop will gain status

Hadoop will continue to displace other IT spending, disrupting enterprise data warehouses and enterprise storage. Oracle, for example, has missed its main revenue target in 5 of the past 10 quarters, and Teradata has missed its revenue and profit targets in 4 of the past 5 quarters.

11. Hadoop still needs help to become a mainstream application

More organizations are recognizing that Apache Hadoop by itself is not ready for enterprise applications: it was not designed to fit into standard enterprise IT processes such as systems management or disaster recovery. Companies will therefore continue to adopt hybrid solutions that combine architectural innovation with Apache Hadoop's open-source software.
