On the Application of Cloud Computing in the R&D of Large Enterprises


Although the IT industry holds differing views on cloud computing, it remains one of the hottest IT terms of recent years. What exactly is cloud computing, and how can it help with enterprise R&D informatization?

In popular terms, cloud computing is hardware + software + services, but the three are not simply added together. For R&D informatization, cloud computing can be defined as a reasonable combination of hardware and software that meets R&D needs and provides easy-to-use, efficient, and low-cost information services for R&D.

Within the enterprise, the "private cloud" is one of the trends in modern enterprise R&D informatization. This article discusses the application of cloud computing in the R&D of large enterprises from the perspectives of desktop, computing, storage, visualization, and networking.

"Desktop Cloud"

R&D personnel take part in R&D activities every day through their computers' desktop systems. The traditional approach is to configure a laptop, PC, or workstation as a terminal according to each developer's needs, with the operating system usually being Windows. As hardware prices keep falling, the cost of this approach seems to have fallen as well, yet it carries higher operating costs and security risks. Because these desktop systems are scattered across the various R&D departments, it is difficult to manage them centrally. Data is stored on the hard disks of personal terminals, and these disks often lack proper disaster-recovery measures (against viruses, bad sectors, and so on), so neither the data nor the physical security of the terminals can be guaranteed.

Desktop graphics workstations, together with the various commercial CAx packages installed on them, are still essential equipment for R&D engineers. However, the hardware and software on which companies spend huge sums are scattered across the R&D departments, and their utilization is often low. For example, 3D CAD design and engineering simulation require professional 3D accelerator cards; if graphics workstations sit on individual desks, those expensive 3D accelerators and workstations are idle whenever their users are not running 3D software or are away on business trips. Likewise, a department may purchase an engineering simulation package for a particular project; if it is installed only in that department, the software may sit idle once the project ends, and even when other departments have projects that need it, using it may be inconvenient.

A desktop cloud can solve the waste of resources caused by such distributed desktops. For example, the software and hardware can be moved into the enterprise data center, managed through an enterprise resource scheduling system with a reasonable allocation policy, and used remotely by the users. Recording and accounting of software and hardware usage, management and maintenance (air conditioning, electricity, and so on), and data security are all fundamentally improved.
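As a rough illustration of the allocation idea only (the `WorkstationPool` class, its policy, and all names below are hypothetical, not part of any vendor product), a centralized pool could hand out shared blade workstations to remote users and keep the usage records mentioned above:

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import List, Optional, Tuple

@dataclass
class Workstation:
    name: str
    has_3d_gpu: bool
    user: Optional[str] = None                  # None means the machine is free
    checked_out_at: Optional[datetime] = None

@dataclass
class WorkstationPool:
    """Hypothetical central pool of blade workstations/PCs in the data center."""
    machines: List[Workstation] = field(default_factory=list)
    usage_log: List[Tuple] = field(default_factory=list)  # (user, machine, start, end)

    def checkout(self, user: str, needs_3d: bool = False) -> Optional[Workstation]:
        """Hand a free machine to a remote user according to a simple policy."""
        free = [m for m in self.machines if m.user is None]
        if needs_3d:
            free = [m for m in free if m.has_3d_gpu]
        else:
            free.sort(key=lambda m: m.has_3d_gpu)  # keep 3D nodes for 3D users
        if not free:
            return None                            # every suitable machine is busy
        machine = free[0]
        machine.user, machine.checked_out_at = user, datetime.now()
        return machine

    def release(self, machine: Workstation) -> None:
        """Return the machine to the pool and log the session for usage statistics."""
        self.usage_log.append((machine.user, machine.name,
                               machine.checked_out_at, datetime.now()))
        machine.user = machine.checked_out_at = None
```

A production desktop cloud would of course sit behind a remote display protocol and directory authentication; the sketch only shows that once the machines are centralized, allocation and usage accounting become possible.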

Various IT vendors have proposed similar "cloud" desktop solutions, such as HP's CCI/VDI/SAM/RGS offerings and Sun Ray. These involve technologies such as blade PCs, blade workstations, desktop virtualization, workstation virtualization, remote graphics compression and transport, and thin clients.

The desktop cloud can be implemented step by step. For example, the desktops of the personnel with the highest data-security requirements can be moved first to blade workstations or blade PCs in the equipment room; graphics workstations can likewise be moved into the data center for remote use; and PCs or workstations can be replaced with a blade PC (or blade workstation) plus thin client model.

"Calculate the cloud"

Engineering simulation is finding more and more applications in enterprise R&D, and solving larger and more complex problems in less time remains a headache for many R&D engineers. A single workstation often cannot meet the high-performance computing needs of such software, so more and more companies have built, or are considering building, engineering simulation high-performance computing (HPC) systems. Such a system is not a simple purchase of a cluster or a minicomputer; the following points are often overlooked:

1. Select a hardware architecture suited to the characteristics of the engineering simulation solvers, including server type, interconnect, and storage architecture. For example, some applications require large amounts of memory that an ordinary thin-node cluster cannot provide; some require low-latency interconnects for large-scale distributed parallel solving, where Gigabit Ethernet cannot achieve acceptable parallel efficiency; and some continuously read and write the disk system during a run, so ordinary NFS and similar file systems become the performance bottleneck.

2. Select resource scheduling software that fits the organization, integrate it closely with the engineering simulation software, and configure an optimized scheduling policy so that the most important and urgent computing jobs obtain software licenses and hardware resources first. Scheduling of hardware resources is relatively mature today, and both commercial and open-source schedulers handle it well; scheduling of software licenses, however, usually requires some customization or code development because application software and license formats vary widely (a sketch follows this list).

3. Interface the HPC system with the engineering simulation pre- and post-processing software, data management software, process management software, and so on. The engineering simulation platform is an important sub-platform of the enterprise R&D platform; embedding the HPC system into it so that users can submit solves in the most convenient way also requires some customization or development work.
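A minimal sketch of the license-aware dispatching described in point 2, under stated assumptions: the `query_free_licenses` and `submit_to_cluster` helpers are hypothetical stand-ins for site-specific code (for example, parsing an lmstat-style license report and wrapping a qsub/sbatch/bsub submission), and the priority scheme is illustrative only.

```python
import heapq
from dataclasses import dataclass, field

@dataclass(order=True)
class SimJob:
    priority: int                                   # lower value = more urgent
    name: str = field(compare=False, default="")
    solver: str = field(compare=False, default="")  # license feature the job needs
    licenses_needed: int = field(compare=False, default=1)

def query_free_licenses(feature: str) -> int:
    """Hypothetical helper: a real site would query the license manager here
    (e.g. parse an lmstat-style report). Dummy data keeps the sketch runnable."""
    return {"structural_solver": 4, "cfd_solver": 0}.get(feature, 0)

def submit_to_cluster(job: SimJob) -> None:
    """Hypothetical helper wrapping the cluster scheduler submission command."""
    print(f"submitting {job.name} ({job.solver}, priority {job.priority})")

def dispatch(jobs: list) -> list:
    """Submit the most urgent jobs whose license needs can be met right now;
    the rest stay queued instead of failing mid-run for lack of a license."""
    heapq.heapify(jobs)
    held = []
    while jobs:
        job = heapq.heappop(jobs)
        if query_free_licenses(job.solver) >= job.licenses_needed:
            submit_to_cluster(job)
        else:
            held.append(job)            # free hardware alone is not enough to run it
    return held

held = dispatch([SimJob(2, "wing_flutter", "cfd_solver", 2),
                 SimJob(1, "crash_box", "structural_solver", 1)])
```

The point is only that the dispatcher must treat commercial licenses as a schedulable resource alongside the hardware queue.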

If an HPC system solves the above problems of performance optimization, scheduling and accounting of resources (including licenses), and integration with pre-/post-processing, data management, and process management software, we believe it can be called the enterprise's "compute cloud".

"Storage Cloud"

Data is the carrier of information. Compared with data in general enterprise information applications, the data involved in R&D informatization is larger (engineering simulation data can reach many gigabytes), has higher read/write performance requirements, and has stricter security requirements. The data lifecycle management concept proposed by some IT vendors is worth considering. For example, for performance, data from projects under active development is placed on the best-performing storage so that developers can read and write it quickly (storage based on 10 Gigabit Ethernet, Fibre Channel, or InfiniBand, for instance), while data from older R&D projects is placed on storage with ordinary performance but larger capacity (such as large-capacity SATA disk arrays accessed over iSCSI). Data that has not been used for a long time can be permanently archived in an optical disc library or a tape library.
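As a hedged sketch of this lifecycle idea (the tier mount points, age thresholds, and directory layout below are assumptions for illustration, not a vendor feature), a nightly job could demote whole project directories down the tiers by age:

```python
import shutil
import time
from pathlib import Path

# Hypothetical tier mount points: fast parallel storage, capacity SATA/iSCSI
# storage, and a staging area feeding the optical disc or tape library.
HOT_TIER = Path("/fast_storage/projects")
WARM_TIER = Path("/sata_storage/projects")
ARCHIVE_STAGING = Path("/archive_staging/projects")

WARM_AFTER_DAYS = 180        # illustrative thresholds, tuned per site
ARCHIVE_AFTER_DAYS = 730

def days_since_modified(path: Path) -> float:
    """Age of the newest file inside a project directory, in days."""
    newest = max((f.stat().st_mtime for f in path.rglob("*") if f.is_file()),
                 default=path.stat().st_mtime)
    return (time.time() - newest) / 86400

def tier_projects() -> None:
    """Demote inactive project directories from hot to warm to archive staging."""
    for project in HOT_TIER.iterdir():
        if project.is_dir() and days_since_modified(project) > WARM_AFTER_DAYS:
            shutil.move(str(project), str(WARM_TIER / project.name))
    for project in WARM_TIER.iterdir():
        if project.is_dir() and days_since_modified(project) > ARCHIVE_AFTER_DAYS:
            shutil.move(str(project), str(ARCHIVE_STAGING / project.name))

if __name__ == "__main__":
    tier_projects()
```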

Storage should also be integrated with desktop applications, computing applications, and web- and database-based applications. For example, the traditional engineering simulation workflow is to do the pre-processing on a Windows graphics workstation, generate the input files needed for the solve, upload them to a Linux HPC system for parallel solving, and then download the results back to the local machine. Uploading and downloading files not only consumes network bandwidth but also easily leads to duplicated use of disk space. If the graphics workstation or blade workstation is placed in the data center, it can share a parallel file system with the HPC system at high speed, whether the hosts run Linux/Unix or Windows, letting the compute nodes and the pre-/post-processing nodes (workstations) read and write a unified storage system as if it were a local hard disk. This avoids file transfers, and unified storage makes disk quotas and data backup easy to implement.

If performance, capacity, security, and the integration with and ease of use by applications such as the desktop and compute clouds are all fully considered, the result can be called the enterprise R&D "storage cloud".

"Visualization Cloud"

In R&D, when massive graphics data must be processed and a single graphics workstation cannot meet the requirements, for example because memory or 3D accelerator capacity is insufficient, a more powerful server or a cluster equipped with multiple 3D accelerator cards is likely needed. When there is no "visualization" requirement, the servers of the "visualization cloud" can also be used for ordinary floating-point computation as part of the "compute cloud".

We agree with those in the IT industry who say that cloud computing is not a new concept; it has much in common with Grid, ASP, SOA, and SaaS. The ultimate goal of cloud computing is to serve a specific business, such as the R&D informatization applications described in this article. For a "private cloud" within the enterprise, there is no need for a one-size-fits-all rush to follow the trend; the transformation can proceed gradually according to the enterprise's actual situation.

For small and medium-sized enterprises (SMEs), building a private R&D cloud computing environment with their own resources may not be realistic, yet their R&D needs are the same as those of large enterprises. We therefore believe it is necessary to build public R&D cloud computing platforms so that ordinary SMEs can, at a lower cost, use the R&D tools that previously only large companies could consider. For a particular industry (such as the mold industry), building such a public platform in a relatively dense industrial park is especially practical.

We believe that, in addition to the characteristics of a "private cloud", a public cloud computing platform for R&D informatization should also consider the following:

1. Network bandwidth

The data involved in R&D informatization tends to be large. Transferring data between the end user's client and the data center servers requires stable bandwidth, and remote graphics operation (such as 3D work) requires low network latency. As Internet infrastructure keeps expanding, access bandwidth and quality keep improving while costs keep falling. In particular, the rapid spread of 3G wireless communication is making mobile broadband a reality: 3G offers up to 7-8 Mb/s, even exceeding typical household wired broadband (currently about 2 Mb/s). We have tested operating a graphics workstation at our company (in the same city but three administrative districts away) over 2 Mb/s residential broadband to manipulate complex 3D models; scaling, rotating, and translating the model showed a slight delay, but the operation was smooth and entirely acceptable. If the public simulation platform is placed in the data center of an industrial park, the enterprise terminals in the park will have fewer network hops to the data center servers, higher bandwidth, and better network quality. We therefore believe that network bandwidth within an industrial park is sufficient for hundreds of accounts to access the software and hardware resources of the public R&D service platform simultaneously.
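A quick back-of-envelope check makes the same point: at the bandwidths quoted above, the bottleneck is downloading result files, not the compressed screen updates of a remote session. The 2 GB file size below is an illustrative assumption; the link speeds are the 2 Mb/s and 8 Mb/s figures mentioned in the text.

```python
def transfer_minutes(size_gb: float, bandwidth_mbps: float) -> float:
    """Ideal transfer time for a file of size_gb gigabytes over a link in megabits/s."""
    size_megabits = size_gb * 1024 * 8      # GB -> Mb, ignoring protocol overhead
    return size_megabits / bandwidth_mbps / 60

print(f"2 GB over 2 Mb/s: {transfer_minutes(2, 2):.0f} min")   # about 137 minutes
print(f"2 GB over 8 Mb/s: {transfer_minutes(2, 8):.0f} min")   # about 34 minutes
# A remote display stream, by contrast, only needs a steady fraction of the link,
# which is why keeping the data in the data center and viewing it remotely works.
```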

2. Data security

R&D data is increasingly part of the lifeblood of business development, so ensuring the security of data on a public service platform is one of the key considerations.

Small and medium-sized manufacturing enterprises generally do not spend large sums on mature network security and data security software, hardware, disaster recovery, backup equipment, procedures, and systems, and are often far behind professional data centers in this respect, so data placed in a data center should actually be safer. Building a private cloud is also relatively expensive for SMEs: the hard disks storing the data usually sit in the graphics workstations on R&D staff desks, staff turnover creates a risk of data loss, and an overly strict IT security regime can create a crisis of trust among staff. With a public service platform, all data resides in the data center, which effectively avoids these drawbacks and keeps the enterprise's important R&D data more secure.

In addition, in terms of technology and management systems, a public cloud computing service platform can adopt industry-leading security technologies and measures such as digital certificates, VPNs, data encryption, and data backup and recovery; it could even consider adopting some banking or military-grade security technologies. Before the platform goes into formal operation, it should obtain industry-recognized security certification so that SMEs can carry out their R&D activities on it with their information secure.
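Of the measures listed, data encryption is the easiest to illustrate. A minimal client-side sketch using the `cryptography` package's Fernet recipe (a symmetric, authenticated scheme); the key handling and the stand-in data are assumptions for illustration, and a real platform would combine this with VPNs, certificates, and proper key management.

```python
from cryptography.fernet import Fernet

# In practice the key would live in a key-management system, not next to the data;
# generating it inline keeps this sketch self-contained and runnable.
key = Fernet.generate_key()
cipher = Fernet(key)

# Stand-in for a CAD model or simulation result read from disk on the client.
model_bytes = b"...contents of a model or result file..."

ciphertext = cipher.encrypt(model_bytes)   # only the ciphertext travels to the platform
restored = cipher.decrypt(ciphertext)      # decryption happens back on the client
assert restored == model_bytes
```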

3. Fee settlement

The public R&D cloud computing platform would use a token-like metering method. Usage of hardware resources (such as logins and logouts on graphics workstations, and the number of compute servers used and their computing time), software resources (such as the number and duration of software module licenses used), and storage resources (the storage space occupied) is recorded, and an adjustable token metering algorithm converts each use into a token value. Tokens can also be spent on other services on the platform, such as e-learning and technology transactions.
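A minimal sketch of such a metering calculation, assuming illustrative resource categories and token rates, since the article does not specify the actual algorithm:

```python
from dataclasses import dataclass

# Illustrative, adjustable token rates per metered resource category.
RATES = {
    "workstation_hours": 1.0,      # graphics workstation login time
    "compute_core_hours": 0.5,     # compute servers x wall-clock hours
    "license_hours": 2.0,          # commercial software module license time
    "storage_gb_months": 0.1,      # occupied storage space
}

@dataclass
class UsageRecord:
    user: str
    workstation_hours: float = 0.0
    compute_core_hours: float = 0.0
    license_hours: float = 0.0
    storage_gb_months: float = 0.0

def tokens_consumed(record: UsageRecord) -> float:
    """Sum each metered quantity weighted by its (adjustable) token rate."""
    return sum(getattr(record, name) * rate for name, rate in RATES.items())

# Example: one month of light use on the public platform.
usage = UsageRecord(user="sme_designer", workstation_hours=40,
                    compute_core_hours=512, license_hours=20, storage_gb_months=100)
print(f"{usage.user} consumed {tokens_consumed(usage):.1f} tokens")  # 40+256+40+10 = 346
```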

R&D cloud computing, whether the private cloud of a large enterprise or a public cloud serving SMEs, will gradually change the way information tools are used in R&D activities, enabling enterprises to conduct R&D at lower cost, with higher efficiency, and more securely. Anshi Asia Pacific actively cooperates with well-known IT vendors at home and abroad to promote the application of cloud computing in enterprise R&D informatization, helping users plan and implement cloud computing IT infrastructure platforms.
