Common Commands for Linux Web Server Site Failure Analysis

Source: Internet
Author: User

The common commands for Linux web server site failure analysis are as follows.

System connection status:

1. View TCP connection states

netstat -nat | awk '{print $6}' | sort | uniq -c | sort -rn

netstat -n | awk '/^tcp/ {++S[$NF]} END {for (a in S) print a, S[a]}'
netstat -n | awk '/^tcp/ {++state[$NF]} END {for (key in state) print key, "\t", state[key]}'
netstat -n | awk '/^tcp/ {++arr[$NF]} END {for (k in arr) print k, "\t", arr[k]}'

netstat -n | awk '/^tcp/ {print $NF}' | sort | uniq -c | sort -rn

netstat -ant | awk '{print $NF}' | grep -v '[a-z]' | sort | uniq -c
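All of these variants rely on the same awk idiom: tally the state column (the last field, `$NF`) into an array and print the totals in `END`. A minimal sketch on made-up netstat-style lines (the addresses and states are illustrative only):

```shell
# Count connection states from sample netstat-style lines
# (assumption: the state is the last whitespace-separated field).
printf '%s\n' \
  'tcp 0 0 10.0.0.1:80 10.0.0.2:5000 ESTABLISHED' \
  'tcp 0 0 10.0.0.1:80 10.0.0.3:5001 TIME_WAIT' \
  'tcp 0 0 10.0.0.1:80 10.0.0.4:5002 ESTABLISHED' |
  awk '/^tcp/ {++s[$NF]} END {for (a in s) print s[a], a}' |
  sort -rn
# prints: 2 ESTABLISHED
#         1 TIME_WAIT
```

Note that on current distributions `netstat` may be absent; `ss -ant` provides equivalent output, with the state in the first column rather than the last.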

2. Find the top 20 requesting IPs (often used to locate the source of an attack):

netstat -anlp | grep 80 | grep tcp | awk '{print $5}' | awk -F: '{print $1}' | sort | uniq -c | sort -nr | head -n20

netstat -ant | awk '/:80/ {split($5, ip, ":"); ++A[ip[1]]} END {for (i in A) print A[i], i}' | sort -rn | head -n20
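The second pipeline's `split($5, ip, ":")` takes the foreign-address column (`address:port`) and keeps only the address before counting. A sketch on hypothetical `netstat -ant` lines:

```shell
# Tally connections per remote IP; $5 is the foreign "addr:port" column.
printf '%s\n' \
  'tcp 0 0 10.0.0.1:80 192.168.1.5:41000 ESTABLISHED' \
  'tcp 0 0 10.0.0.1:80 192.168.1.5:41001 ESTABLISHED' \
  'tcp 0 0 10.0.0.1:80 192.168.1.9:52000 TIME_WAIT' |
  awk '/:80/ {split($5, ip, ":"); ++a[ip[1]]} END {for (i in a) print a[i], i}' |
  sort -rn
# prints: 2 192.168.1.5
#         1 192.168.1.9
```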


3. Use tcpdump to sniff port 80 and see which IP accesses it the most

tcpdump -i eth0 -tnn dst port 80 -c 1000 | awk -F"." '{print $1 "." $2 "." $3 "." $4}' | sort | uniq -c | sort -nr | head -20
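tcpdump with `-nn` prints each endpoint as `address.port`, so splitting the line on dots and keeping the first four fields recovers the source IP (the leading `IP ` tag rides along in field 1). A sketch on a single hypothetical capture line:

```shell
# The first four dot-separated fields of a tcpdump line give the
# source address; the sample line is fabricated.
echo 'IP 192.168.1.7.41000 > 10.0.0.1.80: Flags [S]' |
  awk -F'.' '{print $1 "." $2 "." $3 "." $4}'
# prints: IP 192.168.1.7
```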

4. Find excessive TIME_WAIT connections

netstat -n | grep TIME_WAIT | awk '{print $5}' | sort | uniq -c | sort -rn | head -n20

5. Find excessive SYN connections

netstat -an | grep SYN | awk '{print $5}' | awk -F: '{print $1}' | sort | uniq -c | sort -nr | more

6. List the process using a given port

netstat -ntlp | grep 80 | awk '{print $7}' | cut -d/ -f1
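In `netstat -ntlp` output the last column (`$7`) has the form `PID/Program`, so `cut -d/ -f1` isolates the PID. A sketch on a made-up LISTEN line:

```shell
# Extract the PID from a hypothetical `netstat -ntlp` line,
# where $7 is "PID/Program".
printf 'tcp 0 0 0.0.0.0:80 0.0.0.0:* LISTEN 1234/nginx\n' |
  awk '{print $7}' | cut -d/ -f1
# prints: 1234
```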

Web log analysis 1 (Apache):

1. Top 10 IP addresses by access count

cat access.log | awk '{print $1}' | sort | uniq -c | sort -nr | head -10
cat access.log | awk '{counts[$1]+=1}; END {for (ip in counts) print counts[ip], ip}'
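The two one-liners are equivalent: the first counts via `sort | uniq -c`, the second via an awk array. A sketch over fabricated combined-log lines:

```shell
# Count requests per client IP ($1) on sample access-log lines;
# the addresses and URLs are made up.
printf '%s\n' \
  '192.168.1.5 - - [10/Oct/2023:13:55:36 +0000] "GET /a.html HTTP/1.1" 200 2326' \
  '192.168.1.5 - - [10/Oct/2023:13:55:37 +0000] "GET /b.html HTTP/1.1" 200 150' \
  '192.168.1.9 - - [10/Oct/2023:13:55:38 +0000] "GET /a.html HTTP/1.1" 404 209' |
  awk '{print $1}' | sort | uniq -c | sort -nr | head -10
# prints each count and IP, most frequent first
```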


2. Most visited files or pages, top 20

cat access.log | awk '{print $11}' | sort | uniq -c | sort -nr | head -20

3. List the exe files with the largest transfer sizes (when analyzing download sites)

cat access.log | awk '($7~/\.exe/) {print $10 " " $1 " " $4 " " $7}' | sort -nr | head -20

4. List exe files larger than 200000 bytes (about 200 KB) and the number of occurrences of each

cat access.log | awk '($10 > 200000 && $7~/\.exe/) {print $7}' | sort -n | uniq -c | sort -nr | head -100

5. If the last column of the log records the page transfer time, list the most time-consuming pages

cat access.log | awk '($7~/\.php/) {print $NF " " $1 " " $4 " " $7}' | sort -nr | head -100

6. List the most time-consuming pages (taking more than 60 seconds) and the number of occurrences of each

cat access.log | awk '($NF > 60 && $7~/\.php/) {print $7}' | sort -n | uniq -c | sort -nr | head -100

7. List files with a transfer time of more than 30 seconds

cat access.log | awk '($NF > 30) {print $7}' | sort -n | uniq -c | sort -nr | head -20
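Items 5 through 7 assume the LogFormat ends with the request duration (e.g. Apache's `%T`, in seconds), so that `$NF` is the transfer time. A sketch under that assumption, with fabricated lines whose last field is the duration:

```shell
# Keep URLs ($7) whose last field (assumed: transfer time in seconds)
# exceeds 30; the sample lines are fabricated.
printf '%s\n' \
  '192.168.1.5 - - [10/Oct/2023:13:55:36 +0000] "GET /slow.php HTTP/1.1" 200 100 45' \
  '192.168.1.9 - - [10/Oct/2023:13:55:37 +0000] "GET /fast.php HTTP/1.1" 200 100 2' |
  awk '($NF > 30) {print $7}'
# prints: /slow.php
```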

8. Total website traffic (GB)

cat access.log | awk '{sum+=$10} END {print sum/1024/1024/1024}'
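In the combined log format `$10` is the response size in bytes, so summing it and dividing by 1024 three times yields gigabytes. A quick check on two fabricated 512 MB responses:

```shell
# Two 536870912-byte (512 MB) responses should sum to exactly 1 GB.
printf '%s\n' \
  '192.168.1.5 - - [10/Oct/2023:13:55:36 +0000] "GET /a HTTP/1.1" 200 536870912' \
  '192.168.1.9 - - [10/Oct/2023:13:55:37 +0000] "GET /b HTTP/1.1" 200 536870912' |
  awk '{sum+=$10} END {print sum/1024/1024/1024}'
# prints: 1
```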

9. Count 404 responses

awk '($9 ~ /404/)' access.log | awk '{print $9,$7}' | sort

10. Count HTTP status codes

cat access.log | awk '{counts[$9]+=1}; END {for (code in counts) print code, counts[code]}'
cat access.log | awk '{print $9}' | sort | uniq -c | sort -rn
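Both variants read the status code from `$9`. On fabricated lines:

```shell
# Tally HTTP status codes ($9) across sample log lines.
printf '%s\n' \
  '192.168.1.5 - - [10/Oct/2023:13:55:36 +0000] "GET /a HTTP/1.1" 200 100' \
  '192.168.1.5 - - [10/Oct/2023:13:55:37 +0000] "GET /b HTTP/1.1" 404 50' \
  '192.168.1.9 - - [10/Oct/2023:13:55:38 +0000] "GET /a HTTP/1.1" 200 100' |
  awk '{counts[$9]+=1} END {for (code in counts) print code, counts[code]}' |
  sort
# prints: 200 2
#         404 1
```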

11. Spider analysis: see which spiders are crawling your content

/usr/sbin/tcpdump -i eth0 -l -s 0 -w - dst port 80 | strings | grep -i user-agent | grep -i -E 'bot|crawler|slurp|spider'

Web log analysis 2 (Squid): traffic statistics by domain

zcat squid_access.log.tar.gz | awk '{print $10,$7}' | awk 'BEGIN{FS="[ /]"}{trfc[$4]+=$1}END{for (domain in trfc) {printf "%s\t%d\n", domain, trfc[domain]}}'
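The second awk stage treats both spaces and slashes as field separators (`FS="[ /]"`), so for intermediate lines of the form `bytes http://host/path` the host lands in `$4` (`$2` is `http:` and `$3` is the empty field between the two slashes). A sketch on fabricated size/URL pairs:

```shell
# Per-domain byte totals; with FS="[ /]" the host is field 4.
printf '%s\n' \
  '1024 http://example.com/a.html' \
  '2048 http://example.com/b.html' \
  '512 http://example.org/c.html' |
  awk 'BEGIN{FS="[ /]"}{trfc[$4]+=$1}END{for (d in trfc) printf "%s\t%d\n", d, trfc[d]}' |
  sort
# prints each domain with its byte total
```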

Database:

1. View the SQL statements the database is executing

/usr/sbin/tcpdump -i eth0 -s 0 -l -w - dst port 3306 | strings | egrep -i 'SELECT|UPDATE|DELETE|INSERT|SET|COMMIT|ROLLBACK|CREATE|DROP|ALTER|CALL'

System debugging and analysis:

1. Trace the system calls of a process

strace -p pid

2. Attach a debugger to a process by PID

gdb -p pid

That concludes this article. We hope it helps with your learning, and we hope you will continue to support the community.
