1. See how many unique IPs accessed the site on a given day:
awk '{print $1}' log_file | sort | uniq | wc -l
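The pipeline above can be sanity-checked against a tiny hand-made log; the IPs, paths, and timestamps below are invented for illustration.

```shell
# Invented sample log in combined format; only $1 (the client IP) matters here.
log_file=$(mktemp)
printf '%s\n' \
  '10.0.0.1 - - [12/Aug/2011:06:01:02 +0000] "GET /index.php HTTP/1.1" 200 512' \
  '10.0.0.2 - - [12/Aug/2011:06:02:03 +0000] "GET /index.php HTTP/1.1" 200 512' \
  '10.0.0.1 - - [12/Aug/2011:06:03:04 +0000] "GET /about.php HTTP/1.1" 200 128' \
  > "$log_file"

# 10.0.0.1 appears twice but is counted once, so the count is 2.
unique_ips=$(awk '{print $1}' "$log_file" | sort | uniq | wc -l)
rm -f "$log_file"
```

`sort -u` would collapse duplicates in one step, but `sort | uniq` matches the pattern used throughout this list.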
2. View the number of times a page has been accessed:
grep "/index.php" log_file | wc -l
3. See how many pages each IP visited:
awk '{++S[$1]} END {for (a in S) print a, S[a]}' log_file
4. Sort IPs by the number of pages they accessed, smallest first:
awk '{++S[$1]} END {for (a in S) print S[a], a}' log_file | sort -n
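Against a small invented log, the pipeline above prints each IP once, prefixed by its page count, in ascending order:

```shell
log_file=$(mktemp)
printf '%s\n' \
  '10.0.0.1 - - [12/Aug/2011:06:01:02 +0000] "GET /a.php HTTP/1.1" 200 10' \
  '10.0.0.1 - - [12/Aug/2011:06:02:03 +0000] "GET /b.php HTTP/1.1" 200 10' \
  '10.0.0.1 - - [12/Aug/2011:06:03:04 +0000] "GET /c.php HTTP/1.1" 200 10' \
  '10.0.0.2 - - [12/Aug/2011:06:04:05 +0000] "GET /a.php HTTP/1.1" 200 10' \
  > "$log_file"

# S[$1] accumulates a hit count per IP; sort -n orders by that count.
counts=$(awk '{++S[$1]} END {for (a in S) print S[a], a}' "$log_file" | sort -n)
printf '%s\n' "$counts"
# Output:
# 1 10.0.0.2
# 3 10.0.0.1
rm -f "$log_file"
```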
5. See which pages a certain IP has accessed:
grep ^111.111.111.111 log_file | awk '{print $1,$7}'
6. Count the day's page accesses, excluding search-engine spiders (keep only Mozilla user agents):
awk '{print $12,$1}' log_file | grep '^"Mozilla' | awk '{print $2}' | sort | uniq | wc -l
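In the combined log format, $12 is the first token of the quoted user agent, so keeping only lines that start with `"Mozilla` drops most spiders. A sketch with invented entries, two browser hits and one Googlebot hit:

```shell
log_file=$(mktemp)
printf '%s\n' \
  '10.0.0.1 - - [12/Aug/2011:06:01:02 +0000] "GET / HTTP/1.1" 200 512 "-" "Mozilla/5.0 (X11; Linux)"' \
  '10.0.0.2 - - [12/Aug/2011:06:02:03 +0000] "GET / HTTP/1.1" 200 512 "-" "Mozilla/5.0 (Windows NT)"' \
  '10.0.0.3 - - [12/Aug/2011:06:03:04 +0000] "GET / HTTP/1.1" 200 512 "-" "Googlebot/2.1 (+http://www.google.com/bot.html)"' \
  > "$log_file"

# After the first awk each line is: <first UA token> <IP>;
# grep keeps the browser lines, and the second awk recovers the IP ($2).
browser_ips=$(awk '{print $12,$1}' "$log_file" | grep '^"Mozilla' | awk '{print $2}' | sort | uniq | wc -l)
rm -f "$log_file"
```

Here the Googlebot line is dropped, leaving two distinct browser IPs.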
7. View how many IPs accessed during the one-hour period starting 14:00 on 21 June 2009:
awk '{print $4,$1}' log_file | grep 21/Jun/2009:14 | awk '{print $2}' | sort | uniq | wc -l
8. View the top 10 IP addresses by access count:
awk '{print $1}' log_file | sort | uniq -c | sort -nr | head -10
9. Most frequently accessed files or pages
cat log_file | awk '{print $7}' | sort | uniq -c | sort -nr
10. Count accesses per subdomain, derived from the referer (somewhat inaccurate):
cat log_file | awk '{print $11}' | sed -e 's/http:\/\///' -e 's/\/.*//' | sort | uniq -c | sort -rn | head -20
11. List the files with the largest transfer sizes:
cat log_file | awk '($7~/\.php/) {print $10 " " $1 " " $4 " " $7}' | sort -nr | head -100
12. List pages with output greater than 200000 bytes (about 200KB) and the number of occurrences of each:
cat log_file | awk '($10 > 200000 && $7~/\.php/) {print $7}' | sort -n | uniq -c | sort -nr | head -100
13. If the last column of the log records the page transfer time, list the pages most time-consuming to deliver to the client:
cat log_file | awk '($7~/\.php/) {print $NF " " $1 " " $4 " " $7}' | sort -nr | head -100
14. List the most time-consuming pages (more than 60 seconds) and the number of occurrences of each:
cat log_file | awk '($NF > 60 && $7~/\.php/) {print $7}' | sort -n | uniq -c | sort -nr | head -100
15. List files whose transfer took longer than 30 seconds:
cat log_file | awk '($NF > 30) {print $7}' | sort -n | uniq -c | sort -nr | head -20
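Items 13–15 assume a log format whose last field ($NF) is the transfer time in seconds (for example Apache's %T). A sketch with invented timings, two slow hits on one page and one fast hit elsewhere:

```shell
log_file=$(mktemp)
printf '%s\n' \
  '10.0.0.1 - - [12/Aug/2011:06:01:02 +0000] "GET /slow.php HTTP/1.1" 200 100 45' \
  '10.0.0.2 - - [12/Aug/2011:06:02:03 +0000] "GET /slow.php HTTP/1.1" 200 100 62' \
  '10.0.0.3 - - [12/Aug/2011:06:03:04 +0000] "GET /fast.php HTTP/1.1" 200 100 1' \
  > "$log_file"

# Keep requests that took longer than 30s, then count occurrences per page;
# /slow.php is reported twice, /fast.php not at all.
slow=$(awk '($NF > 30) {print $7}' "$log_file" | sort | uniq -c | sort -nr | head -20)
rm -f "$log_file"
```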
16. Count running processes per command on the current server, in descending order:
ps -ef | awk '{print $8 " " $9}' | sort | uniq -c | sort -nr | head -20
17. IP addresses with the most connections to the current web server:
netstat -ntu | awk '{print $5}' | sort | uniq -c | sort -nr
18. Find IPs with more than 100 requests in the log:
cat log_file | cut -d ' ' -f 1 | sort | uniq -c | awk '{if ($1 > 100) print $0}' | sort -nr | less
19. View the most frequently accessed files among the most recent requests:
cat log_file | tail -10000 | awk '{print $7}' | sort | uniq -c | sort -nr | less
20. View pages with more than 100 visits in the log:
cat log_file | cut -d ' ' -f 7 | sort | uniq -c | awk '{if ($1 > 100) print $0}' | less
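The same threshold pattern can be checked at small scale; here the cutoff is lowered from 100 to 2 so a four-line invented log is enough, and only the page name ($2 of the counted line) is printed:

```shell
log_file=$(mktemp)
printf '%s\n' \
  '10.0.0.1 - - [12/Aug/2011:06:01:02 +0000] "GET /hot.php HTTP/1.1" 200 10' \
  '10.0.0.2 - - [12/Aug/2011:06:02:03 +0000] "GET /hot.php HTTP/1.1" 200 10' \
  '10.0.0.3 - - [12/Aug/2011:06:03:04 +0000] "GET /hot.php HTTP/1.1" 200 10' \
  '10.0.0.4 - - [12/Aug/2011:06:04:05 +0000] "GET /cold.php HTTP/1.1" 200 10' \
  > "$log_file"

# Field 7 of the space-split line is the request path; after uniq -c,
# $1 of each counted line is the hit count and $2 the page.
pages=$(cut -d ' ' -f 7 "$log_file" | sort | uniq -c | awk '{if ($1 > 2) print $2}')
rm -f "$log_file"
```

Only /hot.php clears the threshold, since /cold.php was hit once.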
21. Count the number of visits to a given URL on a given day:
cat log_file | grep '12/Aug/2009' | grep '/images/index/e1.gif' | wc -l
22. The most visited pages (top 20):
cat log_file | awk '{print $7}' | sort | uniq -c | sort -nr | head -20
23. See what a particular IP is doing, according to the log:
cat log_file | grep 219.239.157.240 | awk '{print $1 "\t" $7}' | sort | uniq -c | sort -nr | less
24. List the most time-consuming pages (more than 60 seconds):
cat log_file | awk '($NF > 60 && $7~/\.php/) {print $7}' | sort -n | uniq -c | sort -nr | head -100
25. Website page views per day:
cat log_file | grep '12/Nov/2011' | grep "******.jsp" | wc -l
26. How many unique IPs accessed the site:
cat log_file | grep '12/Aug/2011' | grep "******" | awk '{print $1}' | sort | uniq | wc -l
View the day's IP connection counts from the log, filtering duplicates:
cat log_file | grep "20/Oct/2008" | awk '{print $1}' | sort | uniq -c | sort -nr
27. Sniff port-80 traffic with tcpdump to see which IP is most active:
tcpdump -i eth0 -tnn dst port 80 -c 1000 | awk -F"." '{print $1"."$2"."$3"."$4}' | sort | uniq -c | sort -nr
28. View the number of IP connections for a given time period:
grep "2006:0[7-8]" log_file | awk '{print $1}' | sort | uniq -c | sort -nr | wc -l
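The bracketed grep works because the timestamp field contains `year:hour`, so `2006:0[7-8]` matches the 07:00–08:59 window. With invented entries in three different hours:

```shell
log_file=$(mktemp)
printf '%s\n' \
  '10.0.0.1 - - [12/Jan/2006:07:15:00 +0000] "GET / HTTP/1.1" 200 10' \
  '10.0.0.2 - - [12/Jan/2006:08:30:00 +0000] "GET / HTTP/1.1" 200 10' \
  '10.0.0.3 - - [12/Jan/2006:09:45:00 +0000] "GET / HTTP/1.1" 200 10' \
  > "$log_file"

# Two distinct IPs fall inside 07:00-08:59; the 09:45 hit is excluded,
# so the final wc -l reports 2.
window_ips=$(grep "2006:0[7-8]" "$log_file" | awk '{print $1}' | sort | uniq -c | sort -nr | wc -l)
rm -f "$log_file"
```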
Summary of common commands for Linux log analysis