Summary: As cloud computing deployments deepen, its weak points are being exposed: data security, transmission bottlenecks, and so on. In the end, these weak points all come down to one problem: how to handle massive volumes of data.
Turning computing power into an infrastructure like water and electricity has long been a dream of IT people, and cloud computing is finally making that dream come true. With governments and leading IT companies such as Google, Amazon, IBM, and Microsoft pushing it forward, the day when cloud computing enters ordinary homes is not far off. Cloud computing is not perfect, however. As deployments deepen, its weak points are being exposed: data security, transmission bottlenecks, and so on. In the end, these weak points all come down to how to handle massive volumes of data.
I have followed the microelectronics field for a long time, and recently I noticed that some EDA tool vendors have begun to focus on cloud computing. In my conversations with them, they generally believe that cloud computing is the way to change the future of EDA tools. Their concern is how to respond to users' needs in the cloud in a timely manner and how to give users a good software-as-a-service (SaaS) experience, which is, at bottom, a massive-data-processing bottleneck.
In fact, the server architecture for cloud computing is very different from our current one: traditional servers are CPU-centric, while future cloud computing servers will draw on both the CPU and the GPU, and in particular on the GPU's performance.
Cloud computing provides on-demand (and sometimes even free) resources; previously, organizations had to buy their own compute clusters to obtain such high-performance computing resources. Graphics processors are ubiquitous and excel at running certain kinds of code at high performance: graphics is, in essence, a parallel computing model and a floating-point math problem, and much of the code and many of the algorithms in high-performance computing and other applications are likewise parallel floating-point math problems. Clearly, meeting the massive data-processing demands of cloud computing means bringing the GPU's strengths to bear, and in this respect AMD has found the right direction.
At the China Cloud Computing Conference on May 19, AMD fellow, cloud computing expert, and server platform architect Bob Ogrey revealed that AMD has already introduced the APU accelerated processor, which fuses the CPU and GPU, on the consumer side, and that AMD will move toward the same kind of integration in the server space; only AMD has the advantage of owning both CPU and GPU technology. Bob emphasized that integration does not mean merely putting the CPU and GPU on one chip; it means an integrated computing environment, that is, integration at the software-support level. To that end, AMD already supports DirectCompute 11 and OpenCL, enabling cross-platform development.
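To make the software side concrete, here is a minimal OpenCL sketch in C of the kind of data-parallel floating-point work (a simple SAXPY, out = a*x + y) that GPUs and APUs are built to accelerate. It is only an illustration of the programming model mentioned above, not AMD sample code; the kernel name, the array size, and the choice of CL_DEVICE_TYPE_GPU are assumptions, and error handling is omitted for brevity.

```c
#include <stdio.h>
#include <stdlib.h>
#include <CL/cl.h>

/* Data-parallel kernel: each work-item computes one element of out = a*x + y. */
static const char *kernel_src =
    "__kernel void saxpy(__global const float *x,\n"
    "                    __global const float *y,\n"
    "                    __global float *out,\n"
    "                    const float a) {\n"
    "    int i = get_global_id(0);\n"
    "    out[i] = a * x[i] + y[i];\n"
    "}\n";

int main(void) {
    enum { N = 1 << 20 };                 /* one million floats, illustrative size */
    size_t bytes = N * sizeof(float);
    float *x = malloc(bytes), *y = malloc(bytes), *out = malloc(bytes);
    for (int i = 0; i < N; i++) { x[i] = 1.0f; y[i] = 2.0f; }

    /* Pick the first platform and the first GPU device on it. */
    cl_platform_id platform;
    cl_device_id device;
    clGetPlatformIDs(1, &platform, NULL);
    clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 1, &device, NULL);

    cl_context ctx = clCreateContext(NULL, 1, &device, NULL, NULL, NULL);
    cl_command_queue queue = clCreateCommandQueue(ctx, device, 0, NULL);

    /* Copy the input arrays into device buffers. */
    cl_mem bx = clCreateBuffer(ctx, CL_MEM_READ_ONLY | CL_MEM_COPY_HOST_PTR, bytes, x, NULL);
    cl_mem by = clCreateBuffer(ctx, CL_MEM_READ_ONLY | CL_MEM_COPY_HOST_PTR, bytes, y, NULL);
    cl_mem bo = clCreateBuffer(ctx, CL_MEM_WRITE_ONLY, bytes, NULL, NULL);

    /* Build the kernel from source at run time (OpenCL's portable model). */
    cl_program prog = clCreateProgramWithSource(ctx, 1, &kernel_src, NULL, NULL);
    clBuildProgram(prog, 1, &device, NULL, NULL, NULL);
    cl_kernel kern = clCreateKernel(prog, "saxpy", NULL);

    float a = 2.0f;
    clSetKernelArg(kern, 0, sizeof(cl_mem), &bx);
    clSetKernelArg(kern, 1, sizeof(cl_mem), &by);
    clSetKernelArg(kern, 2, sizeof(cl_mem), &bo);
    clSetKernelArg(kern, 3, sizeof(float), &a);

    /* Launch N work-items; the GPU schedules them across its parallel units. */
    size_t global = N;
    clEnqueueNDRangeKernel(queue, kern, 1, NULL, &global, NULL, 0, NULL, NULL);
    clEnqueueReadBuffer(queue, bo, CL_TRUE, 0, bytes, out, 0, NULL, NULL);

    printf("out[0] = %.1f (expected 4.0)\n", out[0]);

    clReleaseMemObject(bx); clReleaseMemObject(by); clReleaseMemObject(bo);
    clReleaseKernel(kern); clReleaseProgram(prog);
    clReleaseCommandQueue(queue); clReleaseContext(ctx);
    free(x); free(y); free(out);
    return 0;
}
```

Because the kernel is compiled from source at run time, the same host program can target a discrete GPU, an APU's integrated GPU, or even a CPU device, which is exactly the cross-platform property that OpenCL support is meant to provide.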
AMD Global Senior Vice President and President of Greater China Deng pointed out that, as the only company in the world with leading technology in both CPUs and GPU graphics processors, AMD has achieved a real convergence of CPU and GPU in its APU accelerated processor, which gives AMD a great advantage in cloud terminals. "A good cloud computing service experience means we have to understand what users want to do with the cloud, and future cloud terminals will be easy-to-use products with advantages in both graphics and video.
"On the cloud terminal side, because AMD has the APU, a chip in which the CPU and GPU are truly and fundamentally fused, with ultra-low power consumption, very strong performance, all-day battery life, and support for GPU acceleration and heterogeneous computing, we can handle cloud content and services with much greater ease. In the future we will have more low-power, high-performance APU products to inject new vitality into cloud computing terminal products; that is what we see in terminal development," he said.
On the other hand, cloud computing also places higher demands on the server. Deng explained that the workload on cloud computing servers fluctuates with network conditions (busy or idle): Facebook, for example, is especially busy when everyone gets home in the evening and much quieter after people go to bed. The best way to handle these fluctuations is to leave enough performance headroom to absorb the load at peak hours, and the best way to build that headroom is to give the CPU more cores. More cores mean better energy efficiency and lower power consumption per core, so cloud users do not have to procure as many server nodes. This is precisely the advantage of AMD's server processors.
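As a rough illustration of that sizing logic (the numbers below are hypothetical, not AMD or Facebook figures), a back-of-envelope calculation shows how provisioning for peak rather than average load, and packing more cores into each node, changes the number of server nodes required:

```c
#include <stdio.h>
#include <math.h>

/* Hypothetical back-of-envelope sizing: every number here is illustrative only. */
int main(void) {
    double peak_load = 12000.0;   /* requests/s at the evening peak            */
    double avg_load  = 4000.0;    /* requests/s averaged over the day          */
    double per_core  = 50.0;      /* requests/s one core can sustain (assumed) */

    /* Provision for the peak, not the average: the gap is the headroom. */
    double cores_needed = peak_load / per_core;

    int cores_per_node_a = 8;     /* fewer cores per node */
    int cores_per_node_b = 12;    /* more cores per node  */

    int nodes_a = (int)ceil(cores_needed / cores_per_node_a);
    int nodes_b = (int)ceil(cores_needed / cores_per_node_b);

    printf("cores needed for peak: %.0f\n", cores_needed);
    printf("nodes at %d cores/node: %d\n", cores_per_node_a, nodes_a);
    printf("nodes at %d cores/node: %d\n", cores_per_node_b, nodes_b);
    printf("headroom over average load: %.0f%%\n",
           100.0 * (peak_load - avg_load) / avg_load);
    return 0;
}
```

With these assumed figures, covering the peak requires 240 cores: at 8 cores per node that is 30 nodes, while at 12 cores per node it drops to 20, which is the sense in which more cores per processor translate into fewer server nodes.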
Deng pointed out that AMD Opteron multi-core processors, with up to 12 cores and power consumption as low as 32 watts, deliver very high energy efficiency at low cost. A 16-core server processor based on the "Bulldozer" architecture will be launched later this year, with significantly improved performance and low power consumption. In addition, AMD Opteron processors offer excellent scalability and flexibility, helping users maintain consistency and compatibility in the data center: on the same architecture, servers can be deployed flexibly in anything from 1-socket to 4-socket configurations. Deng revealed that more than 2 million AMD processors worldwide are already in use in cloud computing environments; Microsoft's cloud services platform, Windows Azure, for example, uses AMD Opteron processors.
Authoritative research firms forecast that cloud computing will see rapid growth in the coming years, at an annual rate of 26%, five times the growth rate of the traditional IT market.
(Responsible editor: Lu Guang)