Million-level O&M experience (4): server selection and deployment


I used to choose servers blindly. What should I do when traffic grows beyond what the server can handle? My idea at the time was to upgrade the configuration: 4 cores to 8 cores, 8 cores to 16 cores, and likewise for memory, 4 GB to 8 GB to 16 GB. Why would I add another server? The problem is that upgrading the configuration did not bring a matching improvement in performance. Later I realized that this thinking was wrong, or rather that I was still stuck in PC thinking.

I found that upgrading the server configuration did not improve performance as expected. I do not know much about servers or operating systems, but I think the reasons are roughly as follows:

First, most software is not optimized for multi-core CPUs, and even optimized software cannot fully utilize all the CPU resources; the way the software or the operating system schedules work across multiple cores is far from perfect.
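As a side note, even software that knows how to use multiple cores usually has to be configured for it. A minimal nginx illustration (not taken from any real configuration, just an example):

    # Spawn one worker process per CPU core so nginx can use every core.
    worker_processes auto;

    events {
        # Maximum simultaneous connections handled by each worker.
        worker_connections 1024;
    }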

Second, I suspect that the bus bandwidth between the CPU and memory, and between the CPU and the hard disk, is fixed. It is like a very large room with a very small door: no matter how big the room is, people still have to queue up to get in one by one.

Third, website programs generally do not demand much from the CPU. The main performance bottleneck of a website is hard disk I/O. Because the hard disk's I/O speed is fixed, adding CPU cores does not improve disk I/O performance.

In my opinion, replacing one 8-core, 8 GB server with two 4-core, 4 GB servers works much better; in other words, 1 + 1 > 2. By the barrel principle (a bucket holds only as much water as its shortest stave allows), hard disk I/O is the short stave.

Why? Because two servers give you twice the CPU bus bandwidth and twice the hard disk I/O. It is like opening two doors into the room: guests can get in much faster, and the CPU and memory resources can be used more fully.

The two servers can be used for load balancing, or the website can be split into two applications placed on different servers. Normally my website has a PC version and a mobile version, so I put the PC version on one server and the mobile version on the other; the code on the two sides does not interfere, and the two nginx instances reverse-proxy each other. On top of that, with load balancing, whichever server has a problem can be located quickly.
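A rough sketch of this setup in nginx, assuming the PC server also proxies the mobile site; the domain names and the 10.0.0.x addresses are made up purely for illustration:

    # On the PC server: serve the PC site locally and forward
    # requests for the mobile site to the other server.
    upstream mobile_backend {
        server 10.0.0.2:80;   # hypothetical address of the mobile server
    }

    server {
        listen 80;
        server_name m.example.com;

        location / {
            proxy_pass http://mobile_backend;
            proxy_set_header Host $host;
            proxy_set_header X-Real-IP $remote_addr;
        }
    }

The mobile server would carry the mirror-image configuration, proxying the PC domain back to the PC server.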

My design philosophy for website architecture is to split a large website into independent applications, such as the user center, registration and login, the PC site, and the mobile site, depending on the actual situation. Different applications are placed on different servers, and each application reaches the others through reverse proxies, so the servers form a network: whichever server you access, you can reach the other servers through it.
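As a sketch of what that network looks like on any single server, the other applications are simply reverse-proxy targets; the application names, paths, and addresses below are illustrative only:

    # Hypothetical: this server runs the PC site itself and forwards
    # the other applications to the servers that actually host them.
    upstream user_center { server 10.0.0.3:8080; }
    upstream passport    { server 10.0.0.4:8080; }   # registration and login

    server {
        listen 80;
        server_name www.example.com;

        location /user/ {
            proxy_pass http://user_center;
        }
        location /passport/ {
            proxy_pass http://passport;
        }
        location / {
            root /var/www/pc;   # the application served locally
        }
    }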
