Aliyun server configuration and performance optimization

Source: Internet
Author: User
Tags: config, ini, install, perl, aliyun

I asked a number of webmasters about Aliyun, and they all praised it: it is faster and performs better than other hosts, though it does cost a bit more. Below is an introduction to Aliyun server configuration and performance optimization.

I have been using an Aliyun server for a while now, and it is in a different class from my previous virtual host. For example, a backdoor file had been planted on my virtual host; the moment I copied the files from the virtual host to the cloud server, Aliyun immediately raised an alarm about the vulnerability. Most important of all, the server is fully under my own control.

Below are some of the relevant configuration notes:

I. Configure Nginx log handling

1. Set the Nginx log format (/data/server/nginx/conf/nginx.conf)

log_format main '$remote_addr - $remote_user [$time_local] "$request" '
                '$status $body_bytes_sent "$http_referer" '
                '"$http_user_agent" "$http_x_forwarded_for"';
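For reference, an access-log line written in this format looks like the following (all values hypothetical):

1.2.3.4 - - [02/May/2014:00:01:02 +0800] "GET /index.html HTTP/1.1" 200 1024 "http://www.geekso.com/" "Mozilla/5.0" "-"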

2. Rotate the Nginx logs daily

2.1 The logrotate configuration

/data/log/nginx/access/*.log {
    daily
    #dateext
    rotate 1
    missingok
    ifempty
    nocompress
    olddir /data/log/nginx/access/days
    sharedscripts
    postrotate
        [ ! -f /data/server/nginx/logs/nginx.pid ] || kill -USR1 `cat /data/server/nginx/logs/nginx.pid`
    endscript
}

Save this as a file named nginx in the /etc/logrotate.d directory. The logrotate approach has many advantages over simply mv-ing the log file; Google it yourself, I won't go into detail here.
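To check the rotation rules without touching any files, logrotate can be dry-run in debug mode (-d prints what would be done and changes nothing):

/usr/sbin/logrotate -d /etc/logrotate.d/nginx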

2.2 Write the log processing script

#!/bin/bash

# Rotate the Nginx logs according to /etc/logrotate.d/nginx
/usr/sbin/logrotate -vf /etc/logrotate.d/nginx

time=$(date -d "yesterday" +"%Y-%m-%d")

cd /data/log/nginx/access/days

# Rename the rotated files (e.g. geekso.log.1) so they end with yesterday's date
for i in $(ls ./ | grep "^\(.*\)\.[[:digit:]]$")
do
    mv ${i} $(echo ${i} | sed -n 's/^\(.*\)\.\([[:digit:]]\)$/\1/p')-$(echo $time)
done

# Pack each dated file into a tar.gz archive and remove the original
for i in $(ls ./ | grep "^\(.*\)-\([[:digit:]-]\+\)$")
do
    tar zcvf ${i}.tar.gz ./${i}
    /bin/rm -rf ./${i}
done

# Delete archives older than 30 days
find /data/log/nginx/access/ -name "*.tar.gz" -mtime +30 -type f -exec /bin/rm -rf {} \;

A brief description: the script runs logrotate, which moves the logs into the /data/log/nginx/access/days directory; the script then appends yesterday's date to each file name, packs the files into tar.gz archives, and automatically deletes archives older than 30 days.
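As an illustration, a single site log goes through these states (hypothetical file names, assuming the script runs on 2014-05-02):

# /data/log/nginx/access/days/geekso.log.1                  <- produced by logrotate (olddir + rotate 1)
# /data/log/nginx/access/days/geekso.log-2014-05-01         <- after the rename loop
# /data/log/nginx/access/days/geekso.log-2014-05-01.tar.gz  <- after the tar loop; deleted 30 days later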

2.3 Add a scheduled task to crontab

crontab -e

1 0 * * * /data/shs/nginxlogcut.sh >/dev/null 2>&1
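The five cron fields are minute, hour, day of month, month, and day of week; this entry therefore fires every day at 00:01, shortly after midnight, so the script's "yesterday" date is computed correctly:

# m  h  dom  mon  dow   command
# 1  0   *    *    *    /data/shs/nginxlogcut.sh    -> daily at 00:01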

II. Log analysis

1. Install AWStats

If the RPMforge repository has already been added to yum, this is simple: a direct yum install will do. If not:

yum install perl-libwww-perl.noarch
wget http://prdownloads.sourceforge.net/awstats/awstats-7.0-1.noarch.rpm
rpm -ivh awstats-7.0-1.noarch.rpm
cd /usr/local/awstats/tools/
./awstats_configure.pl

2. Configure AWStats

Since it is often necessary to analyze the logs of several sites, put the shared options in a common file to simplify maintenance:

cp awstats.model.conf common.conf

Modify several of the options in it:

#vi common.conf

DNSLookup=0                     # disabling DNS lookup improves efficiency by 99%
DirData="/data/awstats/data"    # directory where the statistics data is stored
SkipFiles="REGEX[^*.gif] REGEX[^*.jpg] REGEX[^*.css] REGEX[^*.js]"    # do not analyze images, style sheets, or JS files
NotPageList="css js class gif jpg jpeg png bmp ico rss xml swf"       # extensions treated as non-page files
LoadPlugin="tooltips"           # show a tooltip for each statistic
LoadPlugin="decodeutfkeys"      # works around UTF-8 encoding problems
LoadPlugin="qqhostinfo"         # add-on plugin that uses the Chunzhen (QQWry) IP database to resolve visitor locations

Increase $LIMITFLUSH to speed up processing: it effectively raises statistics throughput (by reducing disk IO) and prevents the month-end data aggregation from producing an oversized statistics file.

#vi /usr/local/awstats/wwwroot/cgi-bin/awstats.pl

$LIMITFLUSH = 50000;    # Nb of records in data arrays after how many we need to flush data on disk
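Instead of editing by hand, a one-liner can patch the value in place (a sketch, assuming the assignment in awstats.pl looks like the line above):

sed -i 's/\$LIMITFLUSH = [0-9]*;/\$LIMITFLUSH = 50000;/' /usr/local/awstats/wwwroot/cgi-bin/awstats.pl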

3. My statistics example

#vi /etc/awstats/awstats.www.geekso.com.conf

Include "common.conf"
LogFormat=1    # log format: 1 for *nix, 2 for IIS
LogFile="zcat /data/log/nginx/access/days/geekso.log-%YYYY-24-%MM-24-%DD-24.tar.gz |"
#LogFile="/var/log/iislogs/v/ex%YY-24%MM-24%DD-24.log"
#LogFile="/var/log/nginx/crsay.geekso.log-%YYYY-24-%MM-24-%DD-24"
SiteDomain="www.geekso.com"
HostAliases="geekso.com"
DefaultFile="index.html"
DirData="/data/www/awstats/data"
AllowAccessFromWebToAuthenticatedUsersOnly=1
AllowAccessFromWebToFollowingAuthenticatedUsers="geekso"
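To check that this configuration parses and updates correctly, the update can first be run by hand for the site (standard awstats.pl usage; -config=www.geekso.com resolves to the file above):

perl /usr/local/awstats/wwwroot/cgi-bin/awstats.pl -config=www.geekso.com -update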

4. Write the log analysis script

#!/bin/bash

ilog=/data/www/awstats/logcron/awstats_cron.log
awscripts=/usr/local/awstats/wwwroot/cgi-bin/awstats.pl
awpages=/usr/local/awstats/tools/awstats_buildstaticpages.pl

echo "" >> $ilog
echo "Starting AWStats on: $(date)" >> $ilog

# Update the statistics and build static pages for the main site
perl $awpages -update -config=www.geekso.com -lang=cn -dir=/data/www/awstats/geekso -awstatsprog=$awscripts

# Further sites can be updated the same way:
#perl $awscripts -update -config=site2
#perl $awscripts -update -config=site2 -databasebreak=day
#perl $awscripts -update -config=site3
#perl $awscripts -update -config=site3 -databasebreak=day

echo "Ending AWStats on: $(date)" >> $ilog
echo "------------------------------" >> $ilog
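Make the script executable and run it once by hand before scheduling it; the timestamps it appends to its own log confirm it ran (paths as used above):

chmod +x /data/shs/awstats.sh
/data/shs/awstats.sh
tail -n 5 /data/www/awstats/logcron/awstats_cron.log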

5. Add a scheduled task to crontab

#crontab -e

1 1 * * * /data/shs/awstats.sh >/dev/null 2>&1

Appendix:

Description of the AWStats statistics indicators:

Unique visitors: counted by distinct visitor IP; one IP represents one visitor;

Number of visits: a visitor may visit several times within one day (for example, once in the morning and once in the afternoon), so visits are counted as the number of distinct IPs seen within a given time window (for example, one hour);

Pages: the total number of page accesses, excluding images, CSS, and JavaScript files; note that if a page uses several frames, each frame counts as a separate page request;

Hits (files): the total number of file requests from browser clients, including images, CSS, JavaScript, and so on. When a user requests one page that contains images, the server answers multiple file requests, so the number of hits is generally far larger than the number of pages;

Bytes: the total volume of data sent to clients;

Data from the referer: taken from the referer field in the log, which records the address the visitor came from. If a user clicked through from a search engine results page, the referer contains the corresponding search engine query URL, from which the keywords the user searched for can be extracted (see the sketch below).
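As a sketch of that keyword extraction, the referers can be pulled straight out of an access log and tallied. This assumes the log_format from section I, where the referer is the fourth double-quoted field; the geekso.log name follows this article's examples:

awk -F'"' '{print $4}' /data/log/nginx/access/geekso.log | grep -iE 'baidu|google' | sort | uniq -c | sort -rn | head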

III. Prevent cross-directory access

If your server's www directory hosts several sites and one of them is compromised, then without a cross-directory access restriction the attacker can read every file under the www directory, and even system files elsewhere on the server.

The fix is to set the open_basedir accessible directories in each site's conf file under Nginx's conf directory:

fastcgi_param PHP_VALUE "open_basedir=$document_root:/tmp/:/data/www/geekso/";
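After editing, test and reload Nginx so the restriction takes effect (the binary path here is hypothetical, following this article's /data/server/nginx layout; -t and -s reload are standard Nginx flags):

/data/server/nginx/sbin/nginx -t            # check the configuration
/data/server/nginx/sbin/nginx -s reload     # reload workers with the new open_basedir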

IV. Disable dangerous functions

The recommended functions to disable in php.ini are as follows:

disable_functions = pcntl_alarm, pcntl_fork, pcntl_waitpid, pcntl_wait, pcntl_wifexited, pcntl_wifstopped, pcntl_wifsignaled, pcntl_wexitstatus, pcntl_wtermsig, pcntl_wstopsig, pcntl_signal, pcntl_signal_dispatch, pcntl_get_last_error, pcntl_strerror, pcntl_sigprocmask, pcntl_sigwaitinfo, pcntl_sigtimedwait, pcntl_exec, pcntl_getpriority, pcntl_setpriority, eval, popen, passthru, exec, system, shell_exec, proc_open, proc_get_status, chroot, chgrp, chown, ini_alter, ini_restore, dl, pfsockopen, openlog, syslog, readlink, symlink, popepassthru, stream_socket_server, fsocket, chdir
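After restarting PHP, a disabled function reports itself as nonexistent, which gives a quick check (assuming the CLI reads the same php.ini; the binary path follows this article's layout):

/usr/local/php/bin/php -r 'var_dump(function_exists("exec"), function_exists("shell_exec"));'    # both should print bool(false)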

V. Install PHP Zend OPcache

OPcache package download address: http://pecl.php.net/package/ZendOpcache

1. Download, build, and install:

wget http://pecl.php.net/get/zendopcache-7.0.2.tgz
tar zxvf zendopcache-7.0.2.tgz
cd zendopcache-7.0.2
/usr/local/php/bin/phpize
./configure --with-php-config=/usr/local/php/bin/php-config
make
make install
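make install prints the extension directory it copies opcache.so into; you can confirm the shared object is in place before editing php.ini:

ls /usr/local/php/lib/php/extensions/*/opcache.so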

2. Modify the php.ini configuration file by adding the following at its end:

[opcache]
zend_extension="/usr/local/php/lib/php/extensions/no-debug-non-zts-20090626/opcache.so"
opcache.memory_consumption=128
opcache.interned_strings_buffer=8
opcache.max_accelerated_files=4000
opcache.revalidate_freq=60
opcache.fast_shutdown=1
opcache.enable_cli=1
opcache.enable=1

After saving, restart the PHP service and check phpinfo() to confirm that Zend OPcache has been loaded.
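Because opcache.enable_cli=1 is set above, the CLI loads the extension too, so it can also be checked from the shell:

/usr/local/php/bin/php -v | grep -i opcache        # the version banner includes "with Zend OPcache" once loaded
/usr/local/php/bin/php -i | grep opcache.enable    # the ini values set above should be reflected here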
