Big data analysis is driving the rapid growth of public cloud computing. Why? Because it solves practical problems, delivers real value, and is easy to deploy in the public cloud.
We can see this in the concrete data. According to Marvell Business Institute figures, revenue for the top 50 public cloud providers soared 47% to $6.2 billion in one quarter of 2013.
These figures show that public cloud providers are using big data to grow their businesses, acquire new customers, and expand features and functionality. While customers initially come to the public cloud for storage and compute services, many who have recently deployed big data systems have found that the public cloud is the best and most cost-effective platform for big data workloads.
Public cloud providers such as Amazon Web Services, Google, and Microsoft offer their own big data systems in the cloud, both NoSQL and SQL. This stands in contrast to DIY big data, which means dedicating much of your data center to big data processing and, in some cases, spending millions of dollars on database software.
Big data is driving public cloud deployment mainly for the following reasons:
The cost of cloud computing is a fraction of the cost of purchasing big data resources outright.
Over the past few years, cloud-to-cloud and cloud-to-enterprise data integration has improved, making it easier for companies to build large databases in the cloud and synchronize them with any number of operational databases, whether cloud-based or on-premises.
In most cases, the public cloud can deliver better performance and scalability for big data systems, because it offers automatic scaling and provisioning, as sketched below.
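To make the automatic-scaling point concrete, here is a minimal sketch using AWS, one of the providers named above. It assumes the boto3 Python SDK, valid AWS credentials, and a pre-existing Auto Scaling group whose name ("bigdata-workers") is hypothetical; it attaches a target-tracking policy so the fleet grows and shrinks with load, something a DIY data center can only match by buying spare hardware up front.

import boto3

# EC2 Auto Scaling client; assumes credentials and a region are configured.
autoscaling = boto3.client("autoscaling")

# Attach a target-tracking policy: the service adds or removes instances
# automatically to hold average CPU utilization near the target value.
autoscaling.put_scaling_policy(
    AutoScalingGroupName="bigdata-workers",  # hypothetical existing group
    PolicyName="cpu-target-tracking",
    PolicyType="TargetTrackingScaling",
    TargetTrackingConfiguration={
        "PredefinedMetricSpecification": {
            "PredefinedMetricType": "ASGAverageCPUUtilization"
        },
        "TargetValue": 60.0,  # keep average CPU near 60%
    },
)

Once such a policy is in place, capacity tracks demand with no operator intervention, which is the provisioning advantage the public cloud holds over fixed, self-owned infrastructure.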
So, are big data and cloud computing a perfect match? New technologies always bring problems, but in this case there are few obstacles. Big data will continue to drive more public cloud adoption in the future.