Application of cloud computing in the research and development informatization of large enterprises

Although the IT industry holds differing views on cloud computing, it remains one of the hottest IT terms of recent years. What is cloud computing, and how can it help with the enterprise R&D informatization that concerns us?

Put simply, cloud computing is hardware plus software plus services. These three are not simply added together: for R&D informatization, cloud computing should be defined as a sensible combination of hardware and software adapted to R&D needs, providing easy-to-use, efficient, low-cost information services for research and development.

Within the enterprise, the "private cloud" is one of the trends in modern enterprise R&D informatization. This article discusses the application of cloud computing to R&D informatization in large enterprises from several aspects: desktop, computing, storage, visualization, and networking.

Desktop Cloud

Developers take part in R&D activities every day through their computer's desktop system. The traditional approach is to configure a laptop, PC, or workstation (mostly running Windows) as the terminal, based on each developer's needs. As hardware prices keep falling, the cost of this approach appears to drop as well. However, traditional desktops carry higher operating costs and security risks. Because these desktop systems are scattered across R&D departments, they are difficult to manage centrally. Data is stored on the hard disks of personal terminals, and these disks often lack good disaster-tolerance measures (against viruses, bad sectors, and so on), so data safety is not guaranteed; nor can the physical security of data on the terminal be guaranteed.

Desktop graphics workstations, along with the various commercial CAX packages installed on them, remain essential equipment for R&D engineers. But when companies spend huge sums on software and hardware that end up scattered across R&D departments, utilization is likely to be low. For example, 3D CAD design and engineering simulation require a professional 3D accelerator card; if graphics workstations are distributed across individual desks, then whenever a user is not working with 3D software daily, or is traveling, the expensive 3D accelerator card and workstation sit idle. As another example, a department may purchase a simulation package for one project; if it is installed only in that department, the software may sit idle once the project ends, and even if other departments need it, using it is likely to be inconvenient.
A desktop cloud can largely solve the resource waste caused by distributed desktops. For example, move this software and hardware into the enterprise's data center, schedule it through an enterprise resource scheduling system with a sensible allocation policy, and let users access it remotely. Recording and accounting of software and hardware usage, management and maintenance (air conditioning, electricity, and so on), and data security are all transformed.

Various IT vendors have presented similar "cloud" desktop solutions, for example HP's CCI/VDI/SAM/RGS offerings and Sun Ray. These involve blade PCs, blade workstations, desktop virtualization, workstation virtualization, remote graphics compression and transmission, and thin-client technology.
The desktop cloud can be implemented gradually: first move the desktops of the people with the highest data-security requirements onto blade workstations or blade PCs in the machine room, and move graphics workstations into the data center for remote use; then, as personal PCs and workstations come up for replacement, shift other personnel to the blade PC (or blade workstation) plus thin-client model.

Compute Cloud

Engineering simulation is used more and more in enterprise R&D, and solving bigger, more complex problems in less time remains a headache for many researchers. A single workstation often cannot meet such software's demand for high-performance computing, and more and more enterprises have built, or are considering building, high-performance computing (HPC) systems for engineering simulation. Such a system is not a simple purchase of a cluster or a minicomputer; the following points are easily overlooked:

1. Choose the hardware architecture — server type, interconnect, storage architecture — according to the characteristics of the engineering simulation solvers. For example, some applications need very large memory, which an ordinary thin-node cluster cannot provide. Some applications need a low-latency interconnect for large-scale distributed parallel solving, so Gigabit Ethernet cannot achieve the desired parallel efficiency. Some software reads and writes the disk system continuously while running, so common file systems such as NFS become the performance bottleneck.

2. Choose appropriate resource-scheduling software, integrate it closely with the engineering simulation software, and configure an optimized scheduling policy, so that the most important and most urgent computing tasks get priority access to software licenses and hardware resources. We think hardware-resource scheduling is fairly mature today — both commercial and open-source schedulers do it well — but license scheduling, given the wide variety of application software and license formats, is likely to require some customization or development work.
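The license-aware scheduling described in point 2 might look like the following toy model: a priority queue of jobs, where a job is dispatched only when both a compute slot and a license seat are free. This is an illustrative sketch only — real schedulers (e.g. PBS, LSF, Slurm) query the actual license server rather than a local counter, and the job fields here are made up.

```python
import heapq


class LicenseAwareScheduler:
    """Dispatch jobs only when both CPU slots and a license seat are free."""

    def __init__(self, cpu_slots: int, licenses: dict):
        self.free_cpus = cpu_slots
        self.free_lic = dict(licenses)  # e.g. {"solverA": 4} seats available
        self.queue = []                 # min-heap of (priority, seq, job)
        self._seq = 0                   # tie-breaker so dicts never compare

    def submit(self, job: dict, priority: int = 10) -> None:
        heapq.heappush(self.queue, (priority, self._seq, job))
        self._seq += 1

    def dispatch(self) -> list:
        """Start every queued job whose CPU and license needs can be met,
        in priority order; put the rest back in the queue."""
        started, deferred = [], []
        while self.queue:
            prio, seq, job = heapq.heappop(self.queue)
            lic = job["license"]
            if job["cpus"] <= self.free_cpus and self.free_lic.get(lic, 0) > 0:
                self.free_cpus -= job["cpus"]
                self.free_lic[lic] -= 1
                started.append(job["name"])
            else:
                deferred.append((prio, seq, job))
        for item in deferred:
            heapq.heappush(self.queue, item)
        return started
```

The key behavior is that a job with plenty of free CPUs still waits when no license seat is available — exactly the coupling of hardware and license scheduling the text says needs custom work.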

3. Interface the HPC system with the simulation software's pre- and post-processing, data management, and process management tools. The engineering simulation platform is an important part of the enterprise R&D platform; embedding the HPC system into it, so that users can invoke HPC solves in the most convenient way, also requires some customization or development work.

If an HPC system addresses the performance optimization above, resource (including license) scheduling and accounting, and integration with pre/post-processing and data- and process-management software, we think it can be called the enterprise's "compute cloud".
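The platform integration in point 3 — letting the simulation platform submit solves on the user's behalf — often reduces, at its simplest, to rendering a batch script for the cluster's scheduler. A rough sketch, assuming PBS-style directives; `mysolver` is a placeholder for the site's actual solver command, and real portals add error handling, staging, and authentication:

```python
def make_job_script(job_name: str, nodes: int, cores_per_node: int,
                    input_file: str, solver_cmd: str = "mysolver") -> str:
    """Render a batch script that a portal could submit for the user.

    PBS-style directives are assumed; adapt to the site's scheduler.
    """
    total = nodes * cores_per_node
    return "\n".join([
        "#!/bin/sh",
        f"#PBS -N {job_name}",                         # job name
        f"#PBS -l nodes={nodes}:ppn={cores_per_node}", # node request
        f"mpirun -np {total} {solver_cmd} -i {input_file}",
    ])
```

A portal would generate this script from the pre-processor's output and hand it to the scheduler, so the engineer never touches the command line.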

Storage Cloud

Data is the carrier of information. Compared with general enterprise IT applications, R&D data comes in larger blocks (for example, gigabytes of simulation data for a large project) and has higher requirements for read/write performance and security. The concept of data lifecycle management proposed by some IT vendors is worth considering. For example, for performance, put data for projects under active development on the fastest storage (such as storage attached over Gigabit Ethernet, Fibre Channel, or InfiniBand) so developers can read and write it quickly; put data from old R&D projects on slower but higher-capacity storage (such as high-capacity SATA disk arrays over iSCSI); and permanently archive data unused for long periods to optical disc or a tape library.
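The age-based tiering just described can be illustrated with a minimal sketch that moves files untouched for a given number of days to a cheaper archive tier. This is only a toy: a real hierarchical storage management product also leaves stubs behind, indexes the archive, and drives tape migration.

```python
import shutil
import time
from pathlib import Path


def tier_by_age(src: Path, archive: Path, days: int = 365) -> list:
    """Move files not modified for `days` days from fast to archive storage,
    preserving the directory layout. Returns the relative paths moved."""
    cutoff = time.time() - days * 86400
    moved = []
    for f in list(src.rglob("*")):          # snapshot before moving anything
        if f.is_file() and f.stat().st_mtime < cutoff:
            dest = archive / f.relative_to(src)
            dest.parent.mkdir(parents=True, exist_ok=True)
            shutil.move(str(f), str(dest))
            moved.append(str(f.relative_to(src)))
    return moved
```

Run periodically (say, nightly), such a policy keeps the expensive high-performance tier reserved for projects actually under development.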
Storage should also be integrated with desktop applications, computing applications, and web and database applications. For example, traditional engineering simulation practice is to do pre-processing on a Windows graphics workstation, generate the required input files, upload them to the Linux HPC system for parallel solving, and then download the results back to the local machine. The uploads and downloads not only consume network bandwidth but also easily duplicate disk-space consumption. If the graphics workstation or blade workstation sits in the data center, it can share a high-speed parallel file system with the HPC system, whether on Linux/Unix or Windows, letting the compute nodes and the pre/post-processing nodes (workstations) read and write the same storage as if it were a local disk. This avoids file transfers, and unified storage makes disk quotas and data backup easy.

If the storage takes full account of performance, capacity, security, and integration and ease of use with applications such as desktop and computing, we can call it the "storage cloud" of enterprise R&D.

Visual Cloud

If R&D requires heavy graphics processing and a single graphics workstation cannot meet the requirements — insufficient memory, or insufficient 3D accelerator card capacity — then a more powerful server, or a cluster fitted with multiple 3D accelerator cards, is likely needed to handle the massive graphics data. When there is no "visualization" demand, the "visual cloud" servers can also serve as part of the "compute cloud" for ordinary floating-point computation.

We agree with the view held by some in the IT industry that cloud computing is not a new concept; it has many similarities to concepts and methods such as grid computing, ASP, SOA, and SaaS. The ultimate goal of cloud computing is to serve a particular business — such as the R&D informatization application described in this article. For a "private cloud" inside the enterprise, there is no need to leap ahead just to chase the trend; the transformation can proceed according to the enterprise's actual situation.

For small and medium-sized enterprises (SMEs), building a private R&D cloud computing environment on their own may not be realistic. But SMEs have the same R&D needs as large enterprises, so we believe it is worth building a public R&D informatization cloud computing platform, so that ordinary SMEs can also, at lower cost and in the cloud computing mode, use R&D resources that in the past only large enterprises could consider. Such a public platform makes the most practical sense for a specific industry (such as the mold industry), in an industrial park where that industry is concentrated.

We think that, beyond the characteristics of a "private cloud", a public R&D informatization cloud computing platform must also consider the following:

1. Network bandwidth

R&D informatization involves large blocks of data, so transfers between the end user's client and the data center servers need stable bandwidth, and remote graphics operations (such as 3D) need low network latency. As Internet infrastructure expands, access bandwidth and quality keep improving while cost falls. In particular, the rapid spread of 3G wireless communications is making mobile broadband a reality: 3G bandwidth reaches 7–8 Mb/s or more, above ordinary home wired broadband access (currently mostly 2 Mb/s). We tested operating a complex 3D model on a company graphics workstation remotely over a home 2 Mb/s community broadband connection (same city, three districts away): scaling, rotating, and translating the 3D model showed a slight delay, but the operation was smooth and entirely acceptable. If the public simulation platform is placed in the data center of an industrial park, the enterprise terminals in that park have fewer network hops to the data center servers, higher bandwidth, and better network quality. We therefore think an industrial park's network bandwidth is sufficient for hundreds of accounts to access the public R&D service platform's software and hardware resources at the same time.
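The capacity claim above can be turned into a back-of-the-envelope check: given a park uplink and a per-session bandwidth, how many concurrent remote-3D sessions fit? The 2 Mb/s per session matches the experiment described above; the 20% headroom for protocol overhead is an assumption of this sketch, not a figure from the text.

```python
def max_concurrent_sessions(uplink_mbps: float,
                            per_session_mbps: float = 2.0,
                            headroom: float = 0.8) -> int:
    """Rough capacity check for remote-graphics sessions on one link.

    `headroom` reserves a fraction of the link for protocol and other
    traffic (an illustrative assumption, not a measured value).
    """
    return int(uplink_mbps * headroom / per_session_mbps)
```

On a 1 Gb/s park backbone this gives 400 sessions, consistent with the claim that a park network can serve hundreds of simultaneous accounts.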

2. Data security

R&D data is increasingly the lifeblood of enterprise development. Guaranteeing the security of data on a public service platform is therefore one of the important considerations.
Ordinary small and medium-sized manufacturers generally will not spend huge sums on mature network- and data-security software and hardware, or on disaster-tolerance and backup equipment, measures, and systems, and are often far behind professional data centers in these respects; data placed in a data center should therefore be more secure. For an SME, building a private cloud is also relatively costly. Data stored on the hard disk of a graphics workstation on a developer's desk can be lost as staff turn over, while an overly harsh IT security regime may cause a crisis of trust among personnel. With a public service platform, all data lives in the data center, which effectively avoids these drawbacks and makes the enterprise's important R&D data more secure.
In addition, in both technology and management, a public cloud computing service platform will adopt industry-leading security technologies and tools, such as digital certificates, VPNs, data encryption, and data backup and recovery. It could even adopt some of the security technology used by banks or the military, and obtain industry-recognized security certification before formal operation, relieving SMEs of information-security concerns when using the platform for R&D activities.

3. Cost settlement

A public R&D informatization cloud computing platform would bill in a manner similar to token points. From the user's consumption of hardware resources (such as graphics-workstation logon and logoff, and statistics on the number and duration of compute servers used), software resources (such as the number and duration of software-module licenses used), and storage resources (storage space occupied), an adjustable token algorithm calculates the token value consumed each time. Tokens could also be spent on other services within the platform, such as e-commerce and technology transactions.
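The "adjustable token algorithm" might be as simple as a weighted sum over the three resource classes just listed. The rate table below is purely illustrative — tuning those coefficients is exactly what makes the algorithm "adjustable":

```python
def tokens_for_session(hardware_hours: float, license_hours: float,
                       storage_gb_days: float, rates: dict = None) -> float:
    """Token charge for one usage record: a weighted sum of hardware time,
    license time, and storage occupancy. Rates are illustrative defaults."""
    rates = rates or {"hw": 10.0, "lic": 25.0, "gb_day": 0.5}
    return (hardware_hours * rates["hw"]
            + license_hours * rates["lic"]
            + storage_gb_days * rates["gb_day"])
```

A session of 2 hardware-hours, 1 license-hour, and 10 GB-days of storage would cost 50 tokens at these rates; the platform operator adjusts the rate table per resource or per industry.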

R&D informatization cloud computing — whether the private cloud of a large enterprise or the public cloud serving SMEs — will gradually change how information tools are used in R&D activities, letting enterprises carry out R&D at lower cost, with higher efficiency, and more securely. Pera Asia-Pacific actively works with well-known IT vendors at home and abroad to promote and apply cloud computing in enterprise R&D informatization, helping users plan and implement enterprise R&D cloud computing infrastructure platforms.
