Website log processing tool Awstats: Process Multiple Apache logs

Source: Internet
Author: User
Tags: apache, log

In Linux, Awstats is a popular website log processing tool. Installing and configuring Awstats in a Linux environment is fairly easy, but processing multiple Apache logs at the same time (for example, in a web cluster environment) still takes quite a bit of extra work.

Platform environment
1. Three servers: two web servers running Apache, and one server dedicated to log processing with Awstats.
2. Running platform: all servers run Red Hat AS 4.

Design Concept
Every day, the log processing server fetches the log files from the two Apache servers, decompresses the downloaded archives, merges the two separate log files into one, and then generates reports with Awstats. Next we will deploy the service based on this idea.
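To make the idea more concrete, the daily job on the log processing server might look roughly like the sketch below. This is only an outline under assumed host names, paths and ftp credentials; the actual scripts are built step by step in the rest of this article. logresolvemerge.pl is the merge helper shipped with Awstats, which keeps the merged records in chronological order as Awstats expects.

#!/bin/sh
# Rough sketch of the daily workflow on the Awstats server (hosts and paths are assumptions).
day=`date "+%Y%m%d" -d yesterday`

# 1. Fetch yesterday's compressed logs from the two Apache servers via ftp.
wget -q "ftp://sery:PASSWORD@www1/sery.com-access$day.log.gz" -O /tmp/www1-$day.log.gz
wget -q "ftp://sery:PASSWORD@www2/sery.com-access$day.log.gz" -O /tmp/www2-$day.log.gz

# 2. Decompress both archives.
gunzip /tmp/www1-$day.log.gz /tmp/www2-$day.log.gz

# 3. Merge the two logs into one file, keeping the records in time order.
perl /usr/local/awstats/tools/logresolvemerge.pl \
    /tmp/www1-$day.log /tmp/www2-$day.log > /var/log/weblog/sery.com-access$day.log

# 4. Update the Awstats statistics (the Awstats config is assumed to point LogFile at the merged file).
perl /usr/local/awstats/wwwroot/cgi-bin/awstats.pl -config=sery.com -update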

I. Apache Log Generation and Processing

1. Apache log generation:
This is achieved by modifying the Apache configuration file httpd.conf. The following snippet shows how to modify the log-related directives in the configuration file of one Apache server:
SetEnvIf Request_URI \.gif image-request
SetEnvIf Request_URI \.jpg image-request
SetEnvIf Request_URI \.png image-request
SetEnvIf Request_URI \.js image-request
SetEnvIf Request_URI \.css image-request
SetEnvIf Request_URI \.swf image-request
ErrorLog /var/log/web/sery.com-error_log
CustomLog "|/opt/Apache2/bin/rotatelogs /var/log/web/sery.com-access%Y%m%d.log.%H 28800 480" combined env=!image-request
Here is a brief explanation of the directives above. The SetEnvIf Request_URI lines set the environment variable image-request for requests to images and other static files. In the CustomLog directive, env=!image-request means that requests which set that variable (i.e. image requests) are not logged, and the log is split by the Apache log rotation tool rotatelogs. The files are named by year, month, day, and hour, which makes it very convenient to process the logs with scripts later. This CustomLog directive is a bit special: do not drop the "|" in front of the rotatelogs command. Once the Apache service is running, log files such as sery.com-access20071120.log.00 will be generated in the directory /var/log/web.

 

Thanks to the log rotation function, you can see more than one log file in the directory, one for each rotation period.
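For example, with the 8-hour rotation interval configured above, a listing of the log directory on one of the web servers might look something like this (hypothetical dates and file names, following the %Y%m%d.log.%H pattern):

[root@www1 ~]# ls /var/log/web/
sery.com-access20071120.log.00  sery.com-access20071120.log.08
sery.com-access20071120.log.16  sery.com-error_log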

2. Log backup and compression:
Enter the directory where the logs are located, merge the logs for the different time periods of the same day into one file, compress it, and move it to another location: /var/log/weblog-backup. Why compress it and store it in another location? Mainly to save time during transmission. Of course, we cannot do this by hand every day, so naturally we use a shell script. The script content is as follows:
#!/bin/sh
lastlogdate=`date "+%Y%m%d" -d yesterday`

# Merge yesterday's rotated log segments into a single file.
touch /var/log/web/sery.com-access$lastlogdate.log
for i in /var/log/web/sery.com-access$lastlogdate.log.*;
do
    cat $i >> /var/log/web/sery.com-access$lastlogdate.log
    rm -f $i
done

# Compress the merged log.
gzip /var/log/web/sery.com-access$lastlogdate.log

# Move the compressed file to the backup directory for Awstats to fetch.
if [ -f /var/log/web/sery.com-access$lastlogdate.log.gz ];
then
    mv /var/log/web/sery.com-access$lastlogdate.log.gz /var/log/weblog-backup/
fi

# Remove backup files that have not been accessed for more than 7 days.
rm -rf `find /var/log/weblog-backup/ -atime +7`
Name the script merge_log.sh, place it in the directory /usr/local/bin, and grant it execute permission. Then add it to cron so that it runs automatically once a day. Run crontab -e and add the following line:
5 0 * * * /usr/local/bin/merge_log.sh
To check whether the script is correct, run merge_log.sh at least once and check whether the compressed file is generated in the directory /var/log/weblog-backup. If a file such as sery.com-access20071119.log.gz appears as expected, the script works according to our intent.
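A quick manual check might look like this (the date in the file name is just an example):

[root@www1 ~]# /usr/local/bin/merge_log.sh
[root@www1 ~]# ls /var/log/weblog-backup/
sery.com-access20071119.log.gz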

3. Allowing Awstats to obtain the Apache log files:
Deploy an ftp service, create an ftp user, and set that user's home directory to /var/log/weblog-backup, that is, the directory holding the compressed files generated in the previous step. Use the following commands to create the ftp user and assign the directory:
[root@www1 ~]# useradd -d /var/log/weblog-backup -s /sbin/nologin sery
[root@www1 ~]# passwd sery
[root@www1 ~]# chmod -R 755 /var/log/weblog-backup
Because configuring vsftpd is very easy, I will not cover it here. Enable the ftp service, then log in with the user you just created and check whether the files in /var/log/weblog-backup can be seen.
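For reference, a minimal vsftpd.conf for this purpose might contain something like the following (a sketch only; the exact options depend on your vsftpd version and security policy):

# /etc/vsftpd/vsftpd.conf (minimal sketch)
# No anonymous access; only local users such as "sery" may log in.
anonymous_enable=NO
local_enable=YES
# The logs only need to be downloaded, not uploaded.
write_enable=NO
# Confine ftp users to their home directory.
chroot_local_user=YES
# Run vsftpd in standalone mode.
listen=YES

Note that because the ftp user was created with /sbin/nologin as its shell, you may need to add /sbin/nologin to /etc/shells (or relax the PAM shell check) before the ftp login test succeeds.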

At this point, the work we need to do on the Apache server is finished. Simply repeat the preceding steps on each of the Apache servers. Of course, adjust the relevant directories and file names to fit your own environment.

