Use webalizer to analyze web logs

Webalizer is an efficient, free web server log analysis program. Its analysis results are saved as HTML files, which can be viewed conveniently through the web server itself. Many websites on the Internet use webalizer to analyze their web server logs.

Webalizer has the following features:
1. It is written in C, so it runs very efficiently: on a typical machine webalizer can analyze roughly 10,000 records per second, which means a 40 MB log file takes only about 15 seconds to process.

2. webalizer supports the standard Common Logfile Format (CLF). It also supports several variants of the Combined Logfile Format, which makes it possible to gather statistics on referrers and on visitors' browsers and operating systems (a sample entry is shown after this list). In addition, webalizer supports the wu-ftpd xferlog format and the squid log file format.

3. Supports configuration from the command line or from a configuration file.

4. Supports multiple languages, and you can also do your own localization.

5. Supports multiple platforms, such as UNIX, Linux, Windows NT, OS/2, and Mac OS.
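
To illustrate the log formats mentioned in point 2, below is a made-up Combined Logfile Format entry: the standard CLF fields (client host, identity, user, timestamp, request, status code, bytes sent) followed by the referrer and user-agent fields. All values here are hypothetical.

192.168.0.8 - - [15/Mar/2004:10:32:55 +0800] "GET /index.html HTTP/1.0" 200 4523 "http://www.example.com/" "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.0)"

The last two quoted fields are what the Combined format adds over plain CLF, and they are what webalizer uses to report referrers, browsers, and operating systems.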
Installation:
1. Download webalizer from its official website at http://www.mrunix.net/webalizer/. At the time of writing, the latest version is webalizer-2.01-06-src.tgz.
2. Unpack the source package: tar xvzf webalizer-2.01-06-src.tgz
3. The extracted directory contains a lang subdirectory that holds the various language files. The Chinese translation is only available in Traditional Chinese; you can convert it to Simplified Chinese or re-translate it yourself.
4. Enter the extracted directory and build it:
./configure --with-language=chinese
make
5. After compilation succeeds, a webalizer executable is generated. Copy it to the /usr/sbin/ directory: cp webalizer /usr/sbin/
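
Putting the installation steps above together, a complete build-and-install session might look like the following. This is only a sketch: the wget download step and the name of the extracted directory are assumptions based on the version mentioned above.

# Download, unpack, build, and install webalizer
wget http://www.mrunix.net/webalizer/webalizer-2.01-06-src.tgz   # download location assumed
tar xvzf webalizer-2.01-06-src.tgz
cd webalizer-2.01-06            # name of the extracted directory assumed
./configure --with-language=chinese
make
cp webalizer /usr/sbin/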
Then you can configure webalizer.

Configuration: As mentioned above, webalizer can be configured either through command-line options or through a configuration file. This article uses command-line options; for details on the configuration file, refer to the README file. Running webalizer -h prints all command-line parameters:

Usage: webalizer [options] [log file]
-h = print help information
-v, -V = print version information
-d = print additional debugging information
-F type = log format type, type = (clf | ftp | squid)
-i = ignore the history file
-p = preserve state (incremental mode)
-q = suppress informational messages
-Q = suppress all messages
-Y = suppress the country graph
-G = suppress the hourly graph
-H = suppress hourly statistics
-L = suppress color-coded graph legends
-l num = use num background lines in graphs
-m num = visit timeout value (seconds)
-T = print timing information
-c file = use the specified configuration file
-n name = host name to use
-o dir = output directory for the results
-t name = title to use on the report
-a name = hide the named user agent
-r name = hide the named referrer
-s name = hide the named site
-u name = hide the named URL
-x name = use the named filename extension
-P name = page type extension
-I name = index alias
-A num = display the top num user agents
-C num = display the top num countries
-R num = display the top num referrers
-S num = display the top num sites
-U num = display the top num URLs
-e num = display the top num entry pages
-E num = display the top num exit pages
-X = hide individual sites
-D name = use the named DNS cache file
-N num = number of DNS processes (0 = disable DNS)
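If you prefer the configuration file instead of command-line options, a minimal webalizer.conf might look like the sketch below. This is not from the original article: the directive names follow webalizer's sample configuration file, and the paths and host name are only hypothetical example values (the same ones used in the script later in this article).

# Minimal webalizer.conf sketch (directive names from webalizer's sample config)
LogFile     /var/log/httpd/access_log
LogType     clf
OutputDir   /var/www/html/log
HostName    www.test.com
Incremental yes

With a file like this in place, webalizer can be run with -c /path/to/webalizer.conf instead of passing every option on the command line.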

Suppose the web server's host name (and the site's domain name) is www.test.com, the access log is /var/log/httpd/access_log, and the webalizer analysis results should be written to /var/www/html/log. We can then create the following script as /etc/rc.d/webalizer:

#!/bin/sh
run=/usr/sbin/webalizer
$run -F clf -p -n '' -t 'www.test.com' -o /var/www/html/log /var/log/httpd/access_log

Explanation:
-F clf indicates that the web log uses the standard Common Logfile Format.
-p enables incremental mode. After each run webalizer writes a history file, so records that have already been processed are not analyzed again the next time. This lets us rotate the log files regularly without worrying about them growing without limit when traffic is heavy.
-n '' specifies an empty server host name, which makes the output look cleaner.
-t 'www.test.com' specifies the title of the report.
-o /var/www/html/log specifies the output directory for the results.
/var/log/httpd/access_log specifies the log file to analyze.

Then add the following line to /etc/crontab so that the script runs at 1:01 every day:

01 1 * * * root /etc/rc.d/webalizer

Finally, run /etc/rc.d/init.d/crond reload to reload the crond service.
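
Because -p (incremental mode) remembers which records have already been processed, the analysis script can also take care of rotating the log so it does not grow without limit. The wrapper below is only an illustrative sketch, not part of the original article: the copy-and-truncate approach and the file names are assumptions, and a few entries written during the copy may escape analysis.

#!/bin/sh
# Illustrative sketch: run webalizer incrementally, then rotate the analyzed log.
LOG=/var/log/httpd/access_log
OUT=/var/www/html/log

/usr/sbin/webalizer -F clf -p -n '' -t 'www.test.com' -o "$OUT" "$LOG"

# -p keeps a history file, so already-processed records are skipped next time;
# the raw log can therefore be archived and truncated here.
cp "$LOG" "$LOG.$(date +%Y%m%d)"
: > "$LOG"    # truncate the live log in place (Apache keeps appending to it)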
Test: run the following command:
# /etc/rc.d/webalizer
Then visit http://www.test.com/log/ in a browser to see the results of the webalizer analysis. Note: if you use a Chinese language file but your Linux system does not have Chinese support, the text in the generated images may be garbled.
