Use the shell to view Apache IP traffic under Linux


1. View TCP connection status

netstat -nat | awk '{print $6}' | sort | uniq -c | sort -rn

netstat -n | awk '/^tcp/ {++s[$NF]} END {for (a in s) print a, s[a]}'

netstat -n | awk '/^tcp/ {++state[$NF]} END {for (key in state) print key, "\t", state[key]}'

netstat -n | awk '/^tcp/ {++arr[$NF]} END {for (k in arr) print k, "\t", arr[k]}'

netstat -n | awk '/^tcp/ {print $NF}' | sort | uniq -c | sort -rn

netstat -ant | awk '{print $NF}' | grep -v '[a-z]' | sort | uniq -c

netstat -ant | awk '/ip:80/ {split($5, ip, ":"); ++s[ip[1]]} END {for (a in s) print s[a], a}' | sort -n   # replace "ip" with the server's own address

netstat -ant | awk '/:80/ {split($5, ip, ":"); ++s[ip[1]]} END {for (a in s) print s[a], a}' | sort -rn | head -n 10

awk 'BEGIN {printf("http_code\tcount_num\n")} {count[$10]++} END {for (a in count) printf("%s\t\t%s\n", a, count[a])}'
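On systems where netstat has been replaced by the iproute2 tools, the same state summary can be produced with ss. A minimal sketch (in ss -ant output the state is the first field and the first line is a header):

ss -ant | awk 'NR > 1 {++s[$1]} END {for (a in s) print a, s[a]}'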

2. Find the top 20 IPs by number of requests (commonly used to locate the source of an attack):

netstat -anlp | grep 80 | grep tcp | awk '{print $5}' | awk -F: '{print $1}' | sort | uniq -c | sort -nr | head -n20

netstat -ant | awk '/:80/ {split($5, ip, ":"); ++a[ip[1]]} END {for (i in a) print a[i], i}' | sort -rn | head -n20

3. Sniff port 80 access with tcpdump to see which IP is the most active

tcpdump -i eth0 -tnn dst port 80 -c 1000 | awk -F"." '{print $1"."$2"."$3"."$4}' | sort | uniq -c | sort -nr | head -20

4. Find excessive TIME_WAIT connections

netstat -n | grep TIME_WAIT | awk '{print $5}' | sort | uniq -c | sort -rn | head -n20

5. Check for excessive SYN connections

netstat -an | grep SYN | awk '{print $5}' | awk -F: '{print $1}' | sort | uniq -c | sort -nr | more

6. Find the process listening on a given port

netstat -ntlp | grep 80 | awk '{print $7}' | cut -d/ -f1
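Where ss is available, it can also filter on the listening port directly, avoiding the grep; a sketch of an equivalent query:

ss -ltnp 'sport = :80'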

Website log analysis 1 (Apache):
1. Get the top 10 IP addresses by access count

cat access.log | awk '{print $1}' | sort | uniq -c | sort -nr | head -10

cat access.log | awk '{counts[$11]+=1}; END {for (url in counts) print counts[url], url}'

2. Most-visited files or pages: take the top 20, and count all distinct access IPs

cat access.log | awk '{print $11}' | sort | uniq -c | sort -nr | head -20

awk '{print $1}' access.log | sort -n -r | uniq -c | wc -l

3. List the largest exe files transferred (commonly used when analyzing a download site)

cat access.log | awk '($7 ~ /\.exe/) {print $10 " " $1 " " $4 " " $7}' | sort -nr | head -20

4. List exe files larger than 200000 bytes (about 200KB) and the number of occurrences of each

cat access.log | awk '($10 > 200000 && $7 ~ /\.exe/) {print $7}' | sort -n | uniq -c | sort -nr | head -100

5. If the last column of the log records the page transfer time, list the pages that take the longest to reach the client

cat access.log | awk '($7 ~ /\.php/) {print $NF " " $1 " " $4 " " $7}' | sort -nr | head -100

6. List the most time-consuming pages (taking more than 60 seconds) and how many times each occurs

cat access.log | awk '($NF > 60 && $7 ~ /\.php/) {print $7}' | sort -n | uniq -c | sort -nr | head -100

7. List files whose transmission took longer than 30 seconds

cat access.log | awk '($NF > 30) {print $7}' | sort -n | uniq -c | sort -nr | head -20

8. Count total website traffic (GB)

cat access.log | awk '{sum+=$10} END {print sum/1024/1024/1024}'
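To count only bytes that were served with a 200 response, the sum can be restricted by status code; a variant sketch assuming the common log format, where $9 is the status and $10 the response size:

cat access.log | awk '($9 == 200) {sum+=$10} END {print sum/1024/1024/1024 " GB"}'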

9. Count 404 responses

awk '($9 ~ /404/)' access.log | awk '{print $9, $7}' | sort

10. Count HTTP status codes

cat access.log | awk '{counts[$9]+=1}; END {for (code in counts) print code, counts[code]}'

cat access.log | awk '{print $9}' | sort | uniq -c | sort -rn
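A small extension of the same idea (not part of the original list) also prints each code's share of all requests, since NR holds the total line count in the END block:

cat access.log | awk '{counts[$9]++} END {for (code in counts) printf("%s\t%d\t%.2f%%\n", code, counts[code], counts[code]/NR*100)}'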

11. Concurrency (requests per second):

awk '{if ($9 ~ /200|30|404/) count[$4]++} END {for (a in count) print a, count[a]}' access.log | sort -k 2 -nr | head -n10

12. Bandwidth Statistics

cat apache.log | awk '{if ($7 ~ /GET/) count++} END {print "client_request=" count}'

cat apache.log | awk '{byte+=$11} END {print "client_kbyte_out=" byte/1024 "KB"}'

13. Count the number of objects and the average object size

cat access.log | awk '{byte+=$10} END {print byte/NR/1024, NR}'

cat access.log | awk '{if ($9 ~ /200|30/) count[$NF]++} END {for (a in count) print a, count[a], NR, count[a]/NR*100 "%"}'

14. Extract a 5-minute slice of the log

if [ $DATE_MINUTE != $DATE_END_MINUTE ]; then  # check whether the start and end minute stamps differ
    START_LINE=`sed -n "/$DATE_MINUTE/=" $APACHE_LOG | head -n1`  # if they differ, take the line number of the first match for the start stamp
    #END_LINE=`sed -n "/$DATE_END_MINUTE/=" $APACHE_LOG | tail -n1`
    END_LINE=`sed -n "/$DATE_END_MINUTE/=" $APACHE_LOG | head -n1`  # and the line number of the end stamp
    sed -n "${START_LINE},${END_LINE}p" $APACHE_LOG > $MINUTE_LOG  # use the two line numbers to copy the 5 minutes of log content into a temporary file
    GET_START_TIME=`sed -n "${START_LINE}p" $APACHE_LOG | awk -F'[' '{print $2}' | awk '{print $1}' | sed 's#/# #g' | sed 's#:# #'`  # use the line number to pull out the start timestamp
    GET_END_TIME=`sed -n "${END_LINE}p" $APACHE_LOG | awk -F'[' '{print $2}' | awk '{print $1}' | sed 's#/# #g' | sed 's#:# #'`  # use the line number to pull out the end timestamp
fi
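The fragment assumes its variables are initialized earlier in the script. A minimal sketch of that setup, with the variable names taken from the fragment; the paths and the date trick are assumptions, and the -d option requires GNU date:

APACHE_LOG=/var/log/httpd/access_log                       # assumed log location
MINUTE_LOG=/tmp/apache_5min.log                            # assumed temporary file
DATE_MINUTE=`date -d '5 minutes ago' '+%d\/%b\/%Y:%H:%M'`  # start minute in the Apache timestamp format
DATE_END_MINUTE=`date '+%d\/%b\/%Y:%H:%M'`                 # end minute; slashes escaped so the stamps work inside sed's /.../ address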

15. Spider Analysis

Check which spiders are crawling content:

/usr/sbin/tcpdump -i eth0 -l -s 0 -w - dst port 80 | strings | grep -i user-agent | grep -i -E 'bot|crawler|slurp|spider'
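The same question can be answered from the access log instead of live traffic; a sketch assuming the combined log format, in which the User-Agent is the sixth double-quote-delimited field:

cat access.log | awk -F'"' '{print $6}' | grep -i -E 'bot|crawler|slurp|spider' | sort | uniq -c | sort -rn | head -20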

Website log analysis 2 (Squid):

2. Count traffic by domain

zcat squid_access.log.tar.gz | awk '{print $10, $7}' | awk 'BEGIN {FS="[ /]"} {trfc[$4]+=$1} END {for (domain in trfc) {printf "%s\t%d\n", domain, trfc[domain]}}'

The second awk splits each line on spaces and slashes, so in a line of the form "1234 http://domain/path" the byte count lands in field 1 and the domain in field 4.
