Will the world end in 2012? Ticket or no ticket, next year brings many new trends in the IT field that deserve attention. The boom in cloud computing and big data has been unstoppable and will keep growing, so what kind of destruction and rebirth lies hidden in those dark waves?
Big data grows fast, Hadoop set to rise
In 2011, cloud computing made big data hot; in 2012, big data will push Hadoop even higher.
In 2011, big data technology stood at the forefront of the storage field. Analyses of the data explosion made big data an inevitable new selling point and strategic objective for a large number of vendors, and prompted people to start thinking in terms of PB-scale storage.
Mainstream storage vendors, including EMC, IBM, HP, Oracle, and NetApp, have rolled out their own big data strategies. Just as with the rise of the cloud, the big data space is becoming crowded, and vendors are adjusting their positioning and strategies to seize the opportunity early.
So what changes and trends will the big data field see in 2012? We expect the growth of big data to drive the rapid rise of Hadoop.
Hadoop is a freely licensed software framework for data-intensive distributed applications. Built around the MapReduce model, it lets applications run across thousands of nodes and process PB-scale data, making data processing and analysis far more convenient, and its uses include, but are not limited to, distributed computing. It is set to change the business models of many enterprises.
Hadoop offers irreplaceable advantages in scalability, robustness, performance, and cost, and it has in fact already become the main data analysis platform of today's Internet companies.
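To make the MapReduce model mentioned above concrete, here is a minimal word-count sketch in the style of the classic Hadoop MapReduce example: the map step emits (word, 1) pairs, and the reduce step sums them per word across the cluster. The class names and paths are illustrative, not taken from any vendor's product.

```java
import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

  // Map step: emit (word, 1) for every word in the input split.
  public static class TokenizerMapper extends Mapper<Object, Text, Text, IntWritable> {
    private static final IntWritable ONE = new IntWritable(1);
    private final Text word = new Text();

    @Override
    public void map(Object key, Text value, Context context)
        throws IOException, InterruptedException {
      StringTokenizer itr = new StringTokenizer(value.toString());
      while (itr.hasMoreTokens()) {
        word.set(itr.nextToken());
        context.write(word, ONE);
      }
    }
  }

  // Reduce step: sum the counts for each word emitted by all mappers.
  public static class IntSumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
    private final IntWritable result = new IntWritable();

    @Override
    public void reduce(Text key, Iterable<IntWritable> values, Context context)
        throws IOException, InterruptedException {
      int sum = 0;
      for (IntWritable val : values) {
        sum += val.get();
      }
      result.set(sum);
      context.write(key, result);
    }
  }

  public static void main(String[] args) throws Exception {
    Job job = Job.getInstance(new Configuration(), "word count");
    job.setJarByClass(WordCount.class);
    job.setMapperClass(TokenizerMapper.class);
    job.setCombinerClass(IntSumReducer.class);  // local pre-aggregation on each node
    job.setReducerClass(IntSumReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);
    FileInputFormat.addInputPath(job, new Path(args[0]));    // e.g. an HDFS input directory
    FileOutputFormat.setOutputPath(job, new Path(args[1]));  // output directory must not exist
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}
```

The point of the sketch is the division of labor: the developer writes only the small map and reduce functions, while the framework handles splitting the input, scheduling tasks across nodes, and recovering from failures.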
Hadoop seems to be everywhere, and EMC, Dell, IBM, and even Microsoft have joined the Hadoop camp. EMC, for example, has released its own distribution of the open source Apache Hadoop software for data-intensive distributed applications, along with a high-performance, Hadoop-dedicated appliance, the Greenplum HD Data Computing Appliance.
Dell has announced a new partnership with Cloudera to join the increasingly large Apache Hadoop club.
IBM runs Hadoop on SmartCloud Enterprise through its InfoSphere BigInsights software.
Even Microsoft has begun to dabble in Hadoop, announcing that in 2012 it will integrate a Hadoop-as-a-service offering into Windows Azure and SQL Server for companies that handle big data on its platforms.
So many vendors are embracing Hadoop because customers and developers need tools to handle large volumes of data.
In fact, numerous research reports say that many enterprises are considering or already using the Hadoop platform for data mining, performing analyses that were previously impossible, dealing with unstructured data, and making better use of computing resources.
To make the most of Hadoop and similar technologies, developers in the open source community have built a wide range of tooling around it. Much of this open source technology has lacked commercial support, but IDC estimates that at least three commercial companies will offer Hadoop support during the year. At the same time, many vendors will launch analysis tools built on Hadoop components that help companies develop their own applications.
In the long run, Hadoop will reach the stage where few people understand it deeply but almost everyone has heard of it. Whenever large amounts of unstructured data need to be collected and processed, Hadoop will have plenty of work to do; next year, we believe, will be Hadoop's show.
Hybrid clouds give rise to cloud connectors
In 2011, cloud computing swept through like a storm and the entire IT field changed dramatically. So what new trends will cloud computing bring in 2012?
The debate about which kind of cloud to use is still going on and may grow even more intense. At the same time, as IT departments begin to combine public and private clouds, the value of the hybrid cloud is emerging in the enterprise, and many vendors are now pushing the hybrid cloud rather than simply the public cloud model.
Gartner is the latest to call the hybrid cloud "the main focus of 2012" and to estimate that hybrid deployments account for 20% of enterprise cloud use. The hybrid cloud has its advantages, but how does an enterprise move a service from a private cloud to a public cloud? Who takes on that task? A series of problems lies before us.
What is needed, therefore, is a new kind of technology that acts as a connector between the internal and external clouds, enabling virtual machines to migrate between the private and public cloud.
The vCloud Connector introduced by VMware appears to be the beginning of this trend: cloud connector plug-ins and hosting services are paving the way for businesses to move toward a hybrid cloud.
In our view, such a connector should be able to view virtual machines in both private and public clouds, migrate them across environments, support multiple hosting providers, and make deployment and use as flexible as possible; a rough sketch of what such an interface might look like follows.
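Purely as an illustration of the capabilities listed above, and not VMware's actual API, a hypothetical connector interface might be shaped like this; every name and type here is invented for the sketch.

```java
import java.util.List;

// Hypothetical hybrid-cloud connector: list VMs in either cloud,
// move them between clouds, and track long-running migrations.
public interface CloudConnector {

    // Enumerate the virtual machines visible in a given cloud, private or public.
    List<VirtualMachine> listVirtualMachines(CloudEndpoint cloud);

    // Start moving a VM between clouds, e.g. from the private cloud to a public provider.
    MigrationTask migrate(VirtualMachine vm, CloudEndpoint source, CloudEndpoint target);

    // Poll a migration so its bandwidth and performance impact can be monitored.
    MigrationStatus status(MigrationTask task);

    // Supporting types, also purely illustrative.
    final class CloudEndpoint {
        public final String provider;  // e.g. an internal vSphere cluster or a hosting provider
        public final String apiUrl;
        public CloudEndpoint(String provider, String apiUrl) {
            this.provider = provider;
            this.apiUrl = apiUrl;
        }
    }

    final class VirtualMachine {
        public final String id;
        public final long diskSizeGb;  // disk size largely determines migration time over the WAN
        public VirtualMachine(String id, long diskSizeGb) {
            this.id = id;
            this.diskSizeGb = diskSizeGb;
        }
    }

    final class MigrationTask {
        public final String taskId;
        public MigrationTask(String taskId) { this.taskId = taskId; }
    }

    enum MigrationStatus { QUEUED, COPYING, COMPLETED, FAILED }
}
```

The design point is simply that the same operations must work against multiple providers, which is why the endpoint, rather than any one vendor's SDK, is a parameter of every call.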
However, because of the network bandwidth it requires and the performance problems it may cause, this kind of connector software should be used cautiously, with plenty of prototyping and testing before it reaches production.
In any case, this type of architecture will be key to cloud migration. The ability to choose among deployment platforms, public or private, and to migrate between different types of clouds as business requirements dictate is a great attraction for enterprises. We expect more vendors to follow quickly in 2012 with their own connector technology to promote hybrid cloud deployment.