Large Website Architecture - 1. Evolution of Architecture

Source: Internet
Author: User
Tags: server, memory, website, performance

1. Stage One: Single-Server Architecture

This is the initial stage: for example, when first starting a business, we simply buy a single cloud host.

At this stage, to save costs, we put the application, the database, and all files on this one server.

CPU and memory are likewise provisioned at the lowest acceptable cost, and from there the site begins its development path.

2. Stage Two: Separating Application Services from Data Services

After the site first launches, if it operates well it will gradually build up popularity, and the business will grow along with it.

At this point one server obviously cannot meet demand: growing user traffic degrades performance, and at the same time the data keeps accumulating, so we start thinking about adding storage.

The first thing to consider here is separating the application from the data.


As a result, the site architecture becomes three servers: an application server (web server), a file server (resource server), and a database server.

The hardware requirements of the three servers are not the same:

Web server: handles a lot of business logic, so it needs a faster CPU.

Database server: needs to retrieve data quickly and hold more of it, so it needs a larger, faster disk, ideally a solid-state drive.

Resource server: stores user-uploaded files such as photos and videos, so it needs a larger disk, but an ordinary hard disk is enough.
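
To make the separation concrete, here is a minimal sketch of what the application configuration might look like once the database and file storage live on their own hosts instead of on localhost. The host names, ports, and helper function below are hypothetical, chosen only for illustration; they are not part of the original article.

```python
# Hypothetical application settings after splitting onto three servers.
# Host names and ports are placeholders for illustration only.

DATABASE = {
    "host": "db.internal.example.com",   # dedicated database server (fast SSD)
    "port": 3306,
    "name": "shop",
}

FILE_STORAGE = {
    # dedicated resource server for user uploads (large, ordinary disk)
    "base_url": "http://files.internal.example.com/uploads/",
}

def image_url(filename: str) -> str:
    """Build the public URL for a user-uploaded file stored on the file server."""
    return FILE_STORAGE["base_url"] + filename
```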

3. Stage Three: Improving Website Performance with Caching

Website traffic follows the 80/20 rule: 80% of the business is concentrated on 20% of the data.

Therefore, caching this small amount of data can greatly reduce the pressure on the database.

In the early stage, a local in-memory cache on the application server is enough; as the business expands, a remote distributed cache server can be added, using a mature framework such as Redis.
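
A common way to apply this idea is the cache-aside pattern: read from the cache first, fall back to the database on a miss, and store the result with a time-to-live. The sketch below uses the redis-py client; the host name, key format, TTL, and the load_user_from_db placeholder are illustrative assumptions, not part of the original article.

```python
import json
import redis  # redis-py client; assumes a Redis server is reachable

cache = redis.Redis(host="cache.internal.example.com", port=6379, db=0)
CACHE_TTL_SECONDS = 300  # arbitrary expiry, chosen for illustration

def load_user_from_db(user_id: int) -> dict:
    """Placeholder for the real database query."""
    return {"id": user_id, "name": "example"}

def get_user(user_id: int) -> dict:
    key = f"user:{user_id}"
    cached = cache.get(key)
    if cached is not None:
        # Cache hit: the hot 20% of data is served from memory.
        return json.loads(cached)
    # Cache miss: fall back to the database, then populate the cache.
    user = load_user_from_db(user_id)
    cache.setex(key, CACHE_TTL_SECONDS, json.dumps(user))
    return user
```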

4. Stage Four: Application Server Clusters for Greater Concurrency

Clustering has become the standard way for modern websites to handle high concurrency and massive amounts of data.

When one server runs out of capacity, the first thing to consider is not replacing it with a more powerful machine, but adding more servers.

At this point we introduce a load-balancing dispatch server into the architecture; incoming requests hit the load balancer first and are then distributed to the individual application servers in the cluster.
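
As an illustration of that distribution step, below is a toy round-robin dispatcher in Python. A real deployment would use dedicated load-balancing software or hardware; the server addresses and the pick_backend helper here are assumptions for illustration only.

```python
from itertools import cycle

# Pool of application servers behind the load balancer.
# Addresses are placeholders for illustration only.
APP_SERVERS = [
    "http://app1.internal.example.com:8080",
    "http://app2.internal.example.com:8080",
    "http://app3.internal.example.com:8080",
]

_rotation = cycle(APP_SERVERS)

def pick_backend() -> str:
    """Round-robin selection: each request goes to the next server in the pool."""
    return next(_rotation)

# Example: ten incoming requests are spread evenly across the three servers.
if __name__ == "__main__":
    for i in range(10):
        print(f"request {i} -> {pick_backend()}")
```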
