Load Balancing Google App Engine with nginx

Tags: vps, nginx, server

Over the weekend I found myself wondering: can I use nginx to load balance multiple Google App Engine (GAE) applications? It's an interesting idea, so let's try it out. This is the first time I've used nginx for load balancing; it's just a bit of fun, and I don't have a high-traffic application yet anyway.

Introductions to nginx can be found on Baidu Baike, Wikipedia, and the official website. Zhang Yu wrote a very detailed tutorial on building a web server with nginx + PHP (FastCGI) that beats Apache by a factor of ten; thanks to that tutorial, I successfully set up a web server with nginx on my VPS, for which I am grateful.

Load balancing sounds complicated and profound, but once you try it yourself, you will find that the configuration in nginx is simple.

Now, let's build the system. The overall architecture is: nginx listens on port 80 as the single public entry point, load balances each request across an upstream group of local proxies, and each proxy forwards to a different GAE application.

Step 1. Configure nginx proxies to access the GAE applications

First, we need to configure multiple proxies, one per GAE application, so the number depends on how many application quotas you want to combine. Accessing a GAE application directly by IP address is not feasible, because Google's servers cannot tell which application is being requested; we therefore use nginx as a proxy that sets the proper Host header. The configuration of each proxy is as follows:

server {
    listen 8081;
    location / {
        proxy_pass http://app1.appspot.com;
        proxy_set_header Host "app1.appspot.com";
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }
}
...

In the configuration above, setting the Host header is the key part. It should be set to the original domain name of your GAE application, usually xxx.appspot.com. After reloading nginx, you can access your GAE application through port 8081.
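To proxy more than one GAE application, the same server block is simply repeated with a different local port and domain. A minimal sketch for a hypothetical second application (app2.appspot.com and port 8082 are placeholders, not from my actual setup):

server {
    listen 8082;
    location / {
        # Forward to the second GAE application (placeholder name)
        proxy_pass http://app2.appspot.com;
        proxy_set_header Host "app2.appspot.com";
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }
}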

Step 2. Configure load balancing

Load balancing in nginx is implemented by adding an upstream { ... } configuration section. I configured the simplest load balancing method:

 
upstream backend {
    server 127.0.0.1:8081;
    ...
}

As you can see, you can configure multiple server entries here, one for each of the proxies configured earlier, each listening on its own port. There are two ways to distinguish the servers:

1. Each server has a different IP address. At first I assigned a domain name to each proxy and tried to identify the servers through different domain names; this still needs more study, so if you know more about it, please kindly advise.

2. The port number of each server is different. This is the method we used above.

In addition to the address, the server directive takes other important parameters, such as the weight of each server and whether it is a backup server. For more details, see the official wiki, and feel free to experiment with different configurations.
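For example, a weighted upstream with a backup server could look like this (a minimal sketch; the three local proxy ports and the weights are placeholders, not my actual configuration):

upstream backend {
    # Proxies on 8081 and 8082 share traffic roughly 3:1 (weight defaults to 1)
    server 127.0.0.1:8081 weight=3;
    server 127.0.0.1:8082;
    # The proxy on 8083 only receives requests when the others are unavailable
    server 127.0.0.1:8083 backup;
}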

Step 3. Configure the overall entry point for the application

With the other configuration in place, the application needs a single entry point, that is, the domain name used to access the system. My configuration is as follows:

server {
    listen 80;
    server_name g.ooq.me;
    location / {
        proxy_pass http://backend;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }
}

With that, all the configuration is complete. Reload the nginx configuration file and the system is up. To see the effect, click here (link expired).
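For reference, here is roughly how the three pieces fit together inside the http { } context of nginx.conf (a minimal sketch using the placeholder names and ports from the examples above):

http {
    # Load-balancing group pointing at the local proxies
    upstream backend {
        server 127.0.0.1:8081;
        server 127.0.0.1:8082;
    }

    # One proxy server block per GAE application
    server {
        listen 8081;
        location / {
            proxy_pass http://app1.appspot.com;
            proxy_set_header Host "app1.appspot.com";
        }
    }
    server {
        listen 8082;
        location / {
            proxy_pass http://app2.appspot.com;
            proxy_set_header Host "app2.appspot.com";
        }
    }

    # Public entry point that load balances across the proxies
    server {
        listen 80;
        server_name g.ooq.me;
        location / {
            proxy_pass http://backend;
            proxy_set_header Host $host;
        }
    }
}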

Summary

Building this system has many advantages:

1. You can use multiple free GAE applications to run a single application, combining their free quotas.

2. It is a good lab environment for learning nginx load-balancing configuration.

3. You never have to worry about GAE being blocked, as long as your VPS remains accessible.

Disadvantages also exist:

1. For the moment, enterprise-level applications are out of the question, and there is really no need for them here: GAE is a cloud platform and Google already does its own load balancing. This setup is only suitable for personal use; a blog or something similar is a good choice.

2. Bandwidth and latency. My VPS is in the United States, and pinging the GAE servers takes only 6-8 ms, which is quite satisfactory. If your VPS is in China, speed may well be a problem.

3. Sharing a database is a complicated problem. If each GAE application uses a separate datastore, the whole thing is not really one system. A good solution is still to be found.

 

Original blog post: http://lloydsheng.com/2010/05/google-app-engine-load-balancing-with-nginx.html
