Technical discussions in this industry often descend into disorder simply because they involve complicated business concerns. IT departments and the media try to predict the next wave of applications, constantly question the feasibility of cross-platform development, and endlessly hype emerging technologies. The industry as a whole is increasingly inclined to focus on microscopic trends while ignoring bigger, more pressing technical problems. And at the moment, there is no bigger issue than cloud computing.
Media coverage of the cloud is already close to saturation. Is cloud computing hype or reality? How do you define it? Which applications are best suited to the cloud? In fact, the question we should be discussing most is: what major challenge can cloud computing actually solve?
The answer is: it can solve the IT energy crisis.
The term "crisis" may sound exaggerated, but it is very real. The explosive growth of online data is well documented: IDC estimates that the "digital universe" will reach roughly 1,800 exabytes (1.8 zettabytes) by the end of 2011. In addition, as of 2007, data centers in the US consumed nearly 2% of the nation's total electricity on storing, managing, and extracting value from data. All of this takes a great deal of energy, and while data growth and energy consumption can be managed, the upward trend is unstoppable. Innovative solutions are needed to address it.
Does cloud computing provide a new solution?
Most people will accept that cloud computing evolved from earlier models, such as utility computing and SaaS, into what it is today. The key to cloud computing's design is to provide instant scalability to businesses of all sizes in a very flexible way; its advantages lie in flexibility, simplicity, and cost-effectiveness. However, although cloud computing has been adopted more and more widely over the past few years, its other advantages are rarely reflected in corporate IT.
First, as the economic downturn shakes the foundations of the IT market, it is important to know where further investment will come from. Second, while companies are actively working to reduce costs and make the most of existing resources, they face the reality of growing demand for computing and rising energy costs. Managing that growth in computing demand while simultaneously cutting costs is difficult.
To cope with rising energy consumption, many companies, and even federal agencies, have introduced a series of measures to address the growing power problem. Government agencies have publicly stated goals of cutting data center energy costs by a fixed percentage. In addition, specific institutions, including New York State's NYSERDA, offer up to $10 million in financial incentives to reduce energy consumption in data centers.
Early adopters of cloud computing have fully experienced its powerful, user-friendly solutions, which have brought significant reductions in energy costs. Cloud computing is therefore destined to meet ever more user needs.
But it is not as simple as it sounds. In theory, moving workloads from local data centers to the cloud merely concentrates energy consumption that was previously scattered everywhere. The energy crisis is not resolved; it is just consolidated. That is why energy efficiency is the biggest problem associated with cloud computing, forcing cloud providers to deploy highly energy-efficient server solutions. Such measures must be feasible not only today but also in the future.
So how can this be done?
First, consider the server's processor. At the "core of the cloud" are thousands of microprocessors handling the back-end computing work. In traditional IT, before the energy crisis, we lived through the GHz wars. At the time, the assumption was that processors would keep getting faster, and that one day a 10 GHz chip would change the world forever. That seems ridiculous today. As we all know, clock speeds hit a wall, and the balance between performance and power had to be rethought.
For a cloud provider looking to build a server farm with tens of thousands of servers, the key is to deliver appropriate application performance to customers while maintaining the energy efficiency of those servers. This is a very precise balancing act, almost a scientific experiment: finding the right mix of ingredients (price, performance, and power).
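That balancing act can be made concrete with a simple figure of merit. The sketch below scores hypothetical server configurations by performance per total dollar, where total dollars combine purchase price with a year of energy cost. The configurations, prices, and the one-year weighting are all illustrative assumptions, not data from any real procurement.

```python
from dataclasses import dataclass

@dataclass
class ServerConfig:
    name: str
    price_usd: float   # acquisition cost per server (assumed)
    perf: float        # relative sustained performance (arbitrary units)
    power_watts: float # average power draw under load (assumed)

def figure_of_merit(cfg: ServerConfig, usd_per_watt_year: float = 1.0) -> float:
    """Performance per total dollar: purchase price plus one year of energy.

    The one-year horizon and $/watt-year rate are assumptions for
    illustration; real models amortize over hardware lifetime and
    include cooling overhead (PUE).
    """
    total_cost = cfg.price_usd + cfg.power_watts * usd_per_watt_year
    return cfg.perf / total_cost

# Hypothetical candidates spanning the high-clock / low-power spectrum
candidates = [
    ServerConfig("high-clock", price_usd=6000, perf=100, power_watts=400),
    ServerConfig("balanced",   price_usd=4500, perf=85,  power_watts=250),
    ServerConfig("low-power",  price_usd=3000, perf=55,  power_watts=120),
]

for c in candidates:
    print(f"{c.name:10s} perf/$ = {figure_of_merit(c):.4f}")
best = max(candidates, key=figure_of_merit)
print("best trade-off:", best.name)  # → best trade-off: balanced
```

Note that neither the fastest nor the cheapest box wins under this metric; shifting the energy price or amortization period shifts the winner, which is exactly why providers treat this as an experiment rather than a fixed formula.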
With that in mind, many CIOs and IT professionals find themselves wondering: "Can I simply use ultra-low-power processors in the cloud, like the ones in laptops or mobile phones?" The answer is yes, but something is missing. When building a large data center, the most important metric is the best trade-off among price, performance, and power. A cloud deployment built on mobile processors may look attractive on price and power, but its performance, and the untested, unverified nature of the platform, carry inherent risks. These are important issues to consider.
So, is the solution to create a highly energy-efficient, processor-balanced environment built entirely for the cloud? When it comes to power management, making the server platform and its processors intelligent becomes increasingly important. Workloads in a cloud environment are bursty and intermittent, much as in an HPC environment. Servers therefore need to be smart enough to know that when a workload drops below a certain level, part of the platform can be powered off to save energy, and then brought back from standby to active quickly and efficiently when demand returns. This is an area the IT industry needs to keep focusing on.
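The power-down-and-wake behavior described above amounts to threshold-based power gating. The minimal sketch below simulates it for a small cluster: when utilization falls below a low-water mark, a node is powered down; when it rises above a high-water mark, a standby node is woken. The node count and thresholds are illustrative assumptions, not values from any real scheduler.

```python
class PowerManager:
    """Toy threshold-based power gating with hysteresis (illustrative only)."""

    def __init__(self, nodes: int, low: float = 0.3, high: float = 0.7):
        self.total = nodes
        self.active = nodes            # nodes currently powered on
        self.low, self.high = low, high

    def observe(self, utilization: float) -> int:
        """Adjust the active node count given cluster utilization in [0, 1]."""
        if utilization < self.low and self.active > 1:
            self.active -= 1           # idle capacity: power one node down
        elif utilization > self.high and self.active < self.total:
            self.active += 1           # demand returning: wake a standby node
        return self.active

mgr = PowerManager(nodes=8)
trace = [0.9, 0.8, 0.2, 0.1, 0.1, 0.95, 0.9]
history = [mgr.observe(u) for u in trace]
print(history)  # → [8, 8, 7, 6, 5, 6, 7]
```

The gap between the low and high thresholds is the point: it prevents the platform from oscillating between power states on every small fluctuation, which matters because the standby-to-active transition itself costs time and energy.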
Beyond processors, overall data center design and siting strategy also need to improve. Specifically, the most expensive parts of IT are provisioning power and cooling. How will the data center evolve to deal with this? In the long run, data centers will be designed to run at higher ambient temperatures, reducing the cooling required. Over the next 5-10 years, free-air cooling may become the better option, eliminating expensive air conditioning and significantly affecting data center IT spending.
In the short term, however, we can expect data centers to be sited near power plants and natural bodies of water: the former reduces transmission losses, while the latter reduces the need for more elaborate cooling techniques. If we consolidate the many small data centers scattered around the world into a few cloud data centers, we must think about how to truly optimize those sites. In the years ahead, large data centers are likely to cluster in specific regions of the world with inherent advantages. In Ashburn, Virginia, supporting facilities have already begun to appear.
The debate over what role cloud computing will play in the IT industry's future will only intensify, but we cannot ignore an obvious and often overlooked fact: energy consumption is a serious and imminent problem for the IT industry and a key consideration in future data center design. Whether or not you believe future data centers will be built on cloud technology, enterprises must take new measures to cut server energy consumption, reduce costs, and ease environmental pressure.
And while energy efficiency has been an important subject for the IT industry for years, we are still only at the beginning of this discipline. Preparing for this change will require the joint efforts of technology vendors, federal and local regulators, and IT professionals; and CIOs, of course, will need to put it on the corporate agenda.
(Responsible editor: Songtao)