Big data was white-hot in 2012. Everyone from corporate executives to U.S. President Barack Obama was talking about it. But can all these people talking about big data actually define it accurately? Even the experts seem to hold widely divergent definitions.
Overview of the big data industry in 2012
We spent a year talking with vendors and early adopters about big data. We interviewed businesses that are not web-based, as well as web-native companies such as LinkedIn, Eventbrite, Kaggle, and Match.com.
The latter group, the companies built on the web, are actively embracing Hadoop and machine learning. The non-web businesses, by contrast, are at a loss about Hadoop: they may be excited by the competitive edge big data could give the enterprise, but they have not yet prepared the enterprise-wide strategies needed to achieve meaningful initiatives, such as making the required technology investments and hiring data scientists.
It is worth noting that some companies have no big data analysts at all, and they know it. "Big data is really, really hard," many people told us.
That is why we say history will eventually show that 2013, rather than 2012, was the year of big data. In 2013, companies will talk less about the topic, but they will start using big data analysis and gain tangible benefits from it.
Although we would like to make bigger and more specific predictions, we will leave that to the experts in the market.
Big data predictions for 2013
We asked each interviewee a simple question: "What do you predict for big data in 2013?" Some of the answers were interesting, some confusing, and a few even provocative.
John Schroeder, founder and CEO of MapR
Revenue-generating big data applications will outpace cost-saving applications.
Hadoop will become an alternative to other big data analysis platforms.
Hadoop expertise will grow rapidly, but professionals will remain in short supply.
SQL-based Hadoop tools will continue to expand.
HBase will become a popular platform for blob stores (BLOBs, or binary large objects).
Hadoop will be used more in real-time applications.
Hardware will increasingly be optimized for Hadoop.
HBase will attract lightweight OLTP workloads.
Laura Troux, chief strategy officer of Opera Solutions
Wall Street will treat data assets as enterprise value
Over time, Wall Street will increasingly treat "data assets" as a component of corporate value, just as it has treated brand assets in the past. A company's ability to collect and exploit large volumes of proprietary data will form a new axis along which long-term value is built.
Big data applications will be a major trend in 2013
Big data can help us find new and different answers, but across many markets, industries, and research areas, answering the right questions will require new approaches as the impact of big data spreads.
In 2013, the real money will not be in data management platforms like Hadoop and NoSQL, or in how enterprises collect and process data. Since most businesses naturally retire and replace equipment over time, churning through databases and storage infrastructure is not true innovation. The real emerging market, and the lucrative opportunity, lies in big data application functionality: custom applications that help you quickly answer questions in a specific domain.
Herb Cunitz, president of Hortonworks
Vertical "solutions" built on Apache Hadoop
At last year's Hadoop Summit, Geoffrey Moore said that Apache Hadoop had crossed the chasm, and we knew it was headed into mainstream vertical solutions.
As more and more enterprises succeed with it, we will see more patterns and solutions emerge; the challenge is finding the right customization for a particular industry. As system integrators and consultants become Apache Hadoop experts, they will package solutions, and we will see these vertical solutions appear. Building an ecosystem around this growth is a core strategy.
David Jonker, big data strategy lead at SAP
In-memory computing will become the cornerstone of big data projects; timeliness is critical in every one of them. Vendors that have not yet announced in-memory computing capabilities will eventually join the ranks. In search of a "killer" competitive advantage, enterprises will build strategy around personalized consumer experiences.
Privacy and other social issues related to big data will begin to attract more attention by the end of 2013, as the public starts to understand how much personal information enterprises collect and can access.
Ben Werther, founder and CEO of Platfora
I predict that leading companies will embrace a "big data architecture" in 2013. Rather than painful manual operations across isolated data warehouse silos, an architecture that replicates all of your interesting data into a unified Hadoop-based warehouse will become an increasingly important idea at many big banks and internet companies. This is the foundation for more agile exploration, discovery, and analysis in 2013.
Sanjay Mehta, vice president of product marketing at Splunk
The big data conversation will turn to applications
In 2013, the discussion around big data will shift from a focus on the data itself and infrastructure technologies to new applications and methodologies: using big data for specific purposes. In essence, we will hear about and see more software that puts big data to practical use.
For example, by analyzing enterprise data with Splunk Enterprise, Cars.com increased revenue, better maintained its website, and enhanced its user experience. Next year, more users like Cars.com will be talking about driving business decisions by analyzing large amounts of machine-generated data, which is what Splunk calls operational intelligence.
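To make "analyzing machine-generated data" concrete, here is a minimal sketch in Python, entirely unrelated to Splunk's actual product: it aggregates web-server access-log lines into simple operational metrics. The log format, field names, and sample lines are invented for the example.

```python
import re
from collections import Counter

# Hypothetical web-server access-log lines (machine-generated data):
# "<client-ip> <method> <path> <status> <response-seconds>"
LOG_LINES = [
    '203.0.113.5 GET /listings/honda-civic 200 0.12',
    '203.0.113.9 GET /listings/honda-civic 200 0.10',
    '198.51.100.2 GET /listings/ford-f150 500 1.40',
    '198.51.100.2 GET /search?q=truck 200 0.30',
]

LOG_PATTERN = re.compile(r'^(\S+) (\S+) (\S+) (\d{3}) ([\d.]+)$')

def analyze(lines):
    """Aggregate raw log lines into simple metrics:
    hits per path, server-error count, and mean response time."""
    hits = Counter()
    errors = 0
    total_time = 0.0
    parsed = 0
    for line in lines:
        m = LOG_PATTERN.match(line)
        if not m:
            continue  # skip malformed machine output
        ip, method, path, status, seconds = m.groups()
        parsed += 1
        hits[path] += 1
        total_time += float(seconds)
        if status.startswith('5'):
            errors += 1  # count 5xx responses as server errors
    return {
        'top_paths': hits.most_common(2),
        'error_count': errors,
        'avg_response_s': total_time / parsed if parsed else 0.0,
    }

metrics = analyze(LOG_LINES)
print(metrics)
```

A product in this space does the same kind of aggregation at much larger scale, continuously and across many heterogeneous log sources, and then surfaces the results for business decisions.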
Srikanth Velamakanni, founder and CEO of Fractal Analytics
The talent shortage will become more acute, posing a real threat to some companies' growth prospects.
AI will rise in the analytics space. Computer science fields such as artificial intelligence, machine learning, and game theory will play a greater role in big data analysis.
Personal (self-) analytics will rise. More and more companies will provide data that consumers can analyze to manage their own behavior and personal lives.
Companies will develop clearer privacy policies that give consumers more control over what they share. Savvy consumers will actively manage what they share with others.
Big data analysis will find more applications across industries. More and more enterprises, unsatisfied with their in-house big data capabilities, will seek outside experts.
Mobile analytics will grow significantly. Mobile-driven analytics will change how consumers get information and what their consumption habits are.
More intelligent devices and appliances will appear with a large degree of embedded analytics.
There will be more focus on real-time analytics, although I am not optimistic that it will make great progress within the year.
Analytics product companies that cannot handle the volume, variety, or velocity of big data will be eliminated.
Steve Hillion, chief product officer of Alpine Data Labs
Commercial Hadoop distributions will begin to dominate. As more and more enterprises focus on Hadoop, they will pay for fully supported commercial versions. The maturity of these commercial versions makes consolidation likely; we may even see some large-scale acquisitions.
BI will mature. As traditional BI vendors integrate Hadoop into their products via the latest SQL interfaces, new vendors like Datameer and Platfora will push the limits, creating competition that ultimately benefits consumers. Beyond basic BI, vendors will support more advanced visualization and new ways of exploring big data.
Hadoop will find its niche as a data science sandbox. This year, Hadoop has striven to realize its potential as a platform for advanced analytics beyond simple batch processing. For those who want to go beyond batch jobs and basic reports, vendors such as Think Big Analytics, SAS, and Alpine Data Labs will eventually enable everyday users to get insights from their big data.
Data warehouses will move to the cloud. Larger companies will keep their on-premises warehouses, while smaller companies and early adopters will increasingly move their data assets to the cloud.
The challenges of Hadoop will begin to emerge. Users will reach a point of frustration with performance constraints, version confusion, and the proliferation of standards and interfaces. Competing technologies and platforms will leverage this fully, surpassing the performance limits of Hadoop HDFS, so all big data platforms will see more innovation.