Nginx reverse proxy script

Source: Internet
Author: User
Tags: nginx, reverse proxy

I. Overview

A reverse proxy is a proxy server that accepts connection requests from the Internet, forwards them to a server on the internal network, and then returns the server's response to the client that made the request. To the outside world, the proxy itself appears to be the server.

An ordinary (forward) proxy only relays internal clients' requests out to the Internet: the client must be configured to use the proxy and sends it the HTTP requests that would otherwise go directly to the web server. When a proxy instead accepts requests from hosts on the external network and relays them into the internal network, it is called a reverse proxy.

Figure 1: Basic principle of a reverse proxy server

II. Working Principle of a Reverse Proxy Server

A reverse proxy server can stand in for a content server, or act as a load balancer for a cluster of content servers.

1. As a substitute for the Content Server

If your content server holds sensitive information that must remain secure, such as a database of credit card numbers, you can place a proxy server outside the firewall to act on the content server's behalf. When an external client tries to reach the content server, the request is actually sent to the proxy server. The real content stays on the content server, protected behind the firewall, while the proxy server sits outside the firewall and looks like the content server to the client.

When a client sends a request to the site, the request goes to the proxy server. The proxy server then passes the request to the content server through a specific channel in the firewall, the content server returns the result through the same channel, and the proxy server delivers the retrieved information to the client as if the proxy itself were the content server (see Figure 2). If the content server returns an error message, the proxy server intercepts it and rewrites any URLs listed in the headers before sending the message on to the client. This prevents external clients from ever seeing redirection URLs that point at the internal content server.

In this way, the proxy server provides another barrier between the secure database and potential malicious attacks. Unlike a situation where an attacker gains access to the entire database, even a successful attack is at best limited to the information involved in a single transaction. Unauthorized users cannot reach the real content server, because the firewall channel only allows the proxy server through.

 

Figure 2: A reverse proxy server acting as a proxy for the content server
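The URL handling described above maps directly onto nginx's proxy_pass and proxy_redirect directives. The fragment below is only an illustrative sketch, separate from the script later in this article; the public name www.example.com, the internal address 192.168.1.10 and the file path are placeholders.

# sketch: write a minimal content-server proxy block (include it from the
# http {} section of nginx.conf); names and addresses are placeholders
cat > /usr/local/nginx/conf/content_proxy.conf <<'EOF'
server {
    listen 80;
    server_name www.example.com;               # public name seen by clients

    location / {
        proxy_pass http://192.168.1.10:80;     # internal content server behind the firewall

        # rewrite the internal address out of Location/Refresh headers in
        # redirects and error responses, so clients never see internal URLs
        proxy_redirect http://192.168.1.10:80/ http://www.example.com/;
    }
}
EOF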

You can configure the firewall router to let only a specific server on a specific port (in this example, the proxy server on its assigned port) pass through the firewall, and to let no other machine in or out.
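On a Linux-based firewall, that policy could look like the iptables sketch below; the proxy address 200.1.1.5 is assumed for illustration and is not part of the original article.

# sketch: permit only the proxy host to reach web servers through the firewall
iptables -A FORWARD -s 200.1.1.5 -p tcp --dport 80 -j ACCEPT   # proxy server (assumed address)
iptables -A FORWARD -p tcp --dport 80 -j DROP                  # everything else is blocked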

2. As a load balancer for content servers

An organization can use multiple proxy servers to balance the network load across its web servers. In this model, the proxy servers' caching feature is used to build a server pool for load balancing, and the proxies can sit on either side of the firewall. If a web server receives a large number of requests every day, proxy servers can share its load and improve overall access efficiency.

The proxy server acts as an intermediary for client requests on their way to the real server and saves the requested documents in its cache. If there is more than one proxy server, DNS can rotate through their IP addresses in round-robin fashion, effectively picking a route for each request at random. The client uses the same URL every time, but each request may travel through a different proxy server.
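In DNS terms, round-robin simply means publishing one A record per proxy server under the same name, for example in a BIND zone file; the zone file path and the addresses below are assumptions for illustration, not part of the original setup.

# sketch: round-robin DNS, one A record per proxy server (BIND zone syntax)
cat >> /var/named/example.com.zone <<'EOF'
www   IN  A   200.1.1.5    ; proxy server 1 (assumed address)
www   IN  A   200.1.1.6    ; proxy server 2 (assumed address)
EOF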

Multiple proxy servers can be used to handle requests to a busy content server. The benefit is that the content server can sustain a higher load, and do so more efficiently, than it would on its own. During startup, the proxy servers fetch documents from the content server for the first time; after that, the number of requests reaching the content server drops sharply.
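The script later in this article builds exactly this kind of pool with an upstream block. The fragment below extends that idea with nginx's proxy cache so that repeated requests are answered from the proxy instead of the content servers; it is an illustrative sketch only, and the cache path, zone name and backend addresses are placeholders.

# sketch: upstream pool plus proxy cache (include from the http {} block of nginx.conf)
cat > /usr/local/nginx/conf/cache_pool.conf <<'EOF'
proxy_cache_path /usr/local/nginx/cache levels=1:2 keys_zone=web_cache:10m max_size=1g;

upstream web {
    server 200.1.1.10:80;                  # content server 1
    server 200.1.1.20:80;                  # content server 2
}

server {
    listen 80;
    location / {
        proxy_pass  http://web;
        proxy_cache web_cache;             # answer repeat requests from the proxy cache
        proxy_cache_valid 200 302 10m;     # keep successful responses for 10 minutes
    }
}
EOF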

The script is as follows:

Source package version: nginx-1.2.0.tar.gz

#!/bin/bash
# Build nginx 1.2.0 from source and configure it as a reverse proxy / load balancer.

NGINXDIR=/usr/local/nginx
NGINX=nginx-1.2.0
TAR=.tar.gz
NGINXMBER=81          # exit code used when ./configure fails
NGINXUN=82            # reserved exit code (unused)
NGINXPROT=$(lsof -i :80 | awk 'NR==2{print $1}')   # name of the process holding port 80, if any

iptables -F
setenforce 0

# free port 80 if something is already listening on it
netstat -nl | grep :80
if [ $? -eq 0 ]; then
    pkill -9 $NGINXPROT
else
    echo "port 80 already free"
fi

# install build dependencies that are not present yet
for WARP in gcc gcc-c++ pcre pcre-devel openssl openssl-devel
do
    rpm -q $WARP
    if [ $? -ne 0 ]; then
        yum -y install $WARP
    else
        echo "$WARP already installed"
    fi
done
sleep 3

echo "---------------------- install nginx -------------------"

# create the nginx user if it does not exist
id nginx &> /dev/null
if [ $? -ne 0 ]; then
    useradd nginx
else
    echo "user nginx already exists"
fi

cd /root
tar -zxf /root/$NGINX$TAR
cd /root/$NGINX
# --with-http_ssl_module: secure transfer (SSL)
# --with-http_stub_status_module: nginx status/monitoring module
./configure --prefix=$NGINXDIR --user=nginx \
            --group=nginx --with-http_ssl_module \
            --with-http_stub_status_module \
            --without-http_rewrite_module
if [ $? -ne 0 ]; then
    echo "nginx configure failed"
    exit $NGINXMBER
else
    make && make install
fi

# insert the upstream pool and the proxy server block into nginx.conf
sed -i '26a upstream web {' $NGINXDIR/conf/nginx.conf
sed -i '27a server 200.1.1.10:80;' $NGINXDIR/conf/nginx.conf
sed -i '28a server 200.1.1.20:80;' $NGINXDIR/conf/nginx.conf
sed -i '29a }' $NGINXDIR/conf/nginx.conf
sed -i '30a server {' $NGINXDIR/conf/nginx.conf
sed -i '31a listen 80;' $NGINXDIR/conf/nginx.conf
sed -i '32a server_name www.tarena.com;' $NGINXDIR/conf/nginx.conf
sed -i '33a location / {' $NGINXDIR/conf/nginx.conf
sed -i '34a root html;' $NGINXDIR/conf/nginx.conf
sed -i '35a index index.html;' $NGINXDIR/conf/nginx.conf
sed -i '36a proxy_pass http://web;' $NGINXDIR/conf/nginx.conf
sed -i '37a }' $NGINXDIR/conf/nginx.conf
sed -i '38a }' $NGINXDIR/conf/nginx.conf

ln -s $NGINXDIR/sbin/* /usr/sbin/

# check the configuration and start nginx
nginx -t
if [ $? -eq 0 ]; then
    nginx
else
    echo "nginx configuration file check failed"
fi
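To try the script, save it as an executable file (the file name below is a placeholder), place nginx-1.2.0.tar.gz in /root and run it as root; a quick curl against the proxy then confirms that requests are answered through the upstream pool.

chmod +x nginx_proxy.sh      # assumed file name
./nginx_proxy.sh

# requests carrying the configured host name should be proxied to 200.1.1.10 / 200.1.1.20
curl -H 'Host: www.tarena.com' http://127.0.0.1/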

 

