Daily log analysis is a real headache. We are often asked to provide statistics such as PV, UV, and the number of independent IP addresses. You could write a program in C/C++ or Java for this: read the file, scan it line by line, accumulate the values you care about in a data structure, and print the final result. In fact, Linux itself has very powerful text-processing facilities, and you can get the same results with the shell plus a few small text tools.
The access log file produced by nginx looks like this:
Log File Code
192.168.1.166 - - 119272312 [05/Nov/2011:16:06:59 +0800] "GET /index.html HTTP/1.1" 200 370 "http://192.168.1.201/" "Chrome/15.0.874.106" "-"
192.168.1.166 - - 119272312 [05/Nov/2011:16:06:59 +0800] "GET /poweredby.png HTTP/1.1" 200 3034 "http://192.168.1.201/" "Chrome/15.0.874.106" "-"
192.168.1.177 - - 1007071650 [05/Nov/2011:16:06:59 +0800] "GET /favicon.ico HTTP/1.1" 404 3650 "-" "Chrome/15.0.874.106" "-"
192.168.1.178 - - 58565468 [05/Nov/2011:16:17:40 +0800] "GET / HTTP/1.1" 200 3700 "-" "Mozilla/5.0 (compatible; MSIE 9.0; Windows NT 6.1; Trident/5.0)" "-"
192.168.1.166 - - 119272312 [05/Nov/2011:16:17:40 +0800] "GET /nginx-logo.png HTTP/1.1" 200 370 "http://192.168.1.201/" "Chrome/15.0.874.106" "-"
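In this format, splitting a line on spaces puts the client IP address in field 1 and the user ID in field 4; the commands below rely on those positions.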
PV is very simple: it is usually just the number of requests for a given URL, for example the number of visits to /index.html.
Shell code
grep -c "/index.html" /var/log/nginx/access.log
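Equivalently, the -c flag can be dropped and the matching lines piped through wc -l; both forms print the same count:

Shell code

grep "/index.html" /var/log/nginx/access.log | wc -l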
UV is counted by user ID (column 4). First we need to extract that field: cut splits the line on spaces (-d " ") and takes field 4 (-f 4). Then we need to deduplicate. The uniq tool is very fast, but it only collapses adjacent duplicate lines, so identical IDs separated by other lines would still be counted more than once; the input therefore has to be sorted with sort first. After sorting, uniq removes the duplicates. The commands are connected with pipes, and wc -l prints the final count.
For example, the UV of the /index.html page:
Shell code
grep "/index.html" /var/log/nginx/access.log | cut -d " " -f 4 | sort | uniq | wc -l
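As a small variation on the same pipeline, replacing uniq with uniq -c and adding a numeric sort shows how many times each user ID hit the page, rather than just the total:

Shell code

grep "/index.html" /var/log/nginx/access.log | cut -d " " -f 4 | sort | uniq -c | sort -rn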
Independent IP addresses:
To count the independent IP addresses for the whole site, there is no need to grep for a specific page; just feed the entire log through with cat:
Shell code
cat /var/log/nginx/access.log | cut -d " " -f 1 | sort | uniq | wc -l
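As a side note, cat is not strictly needed here: cut can read the file directly, and sort -u is shorthand for sort | uniq, so an equivalent form is:

Shell code

cut -d " " -f 1 /var/log/nginx/access.log | sort -u | wc -l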
That covers the basic statistical needs, and we have not even had to bring in the powerful awk yet :)
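For reference, if you did want to reach for awk, a minimal sketch of the same independent-IP count (assuming the same space-separated layout with the client IP in field 1) looks like this; it counts each address only the first time it appears:

Shell code

# seen[$1]++ is non-zero after the first occurrence, so count grows once per distinct IP
awk '!seen[$1]++ { count++ } END { print count + 0 }' /var/log/nginx/access.log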