Common web server and website fault analysis commands

Source: Internet
Author: User

System connection status:
1. View TCP connection status
netstat -nat | awk '{print $6}' | sort | uniq -c | sort -rn

netstat -n | awk '/^tcp/ {++S[$NF]} END {for (a in S) print a, S[a]}'
or
netstat -n | awk '/^tcp/ {++state[$NF]} END {for (key in state) print key, "\t", state[key]}'
netstat -n | awk '/^tcp/ {++arr[$NF]} END {for (k in arr) print k, "\t", arr[k]}'

netstat -n | awk '/^tcp/ {print $NF}' | sort | uniq -c | sort -rn

netstat -ant | awk '{print $NF}' | grep -v '[a-z]' | sort | uniq -c
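As a sanity check, the state-counting awk can be run against canned netstat-style lines. The sample data below is hypothetical (real netstat output varies by system), but the counting logic is the same:

```shell
# Hypothetical sample of `netstat -n`-style lines (real output varies by system)
sample='tcp 0 0 10.0.0.1:80 10.0.0.9:5001 ESTABLISHED
tcp 0 0 10.0.0.1:80 10.0.0.8:5002 TIME_WAIT
tcp 0 0 10.0.0.1:80 10.0.0.7:5003 ESTABLISHED'

# Same per-state counter as above, fed from the sample instead of netstat
echo "$sample" | awk '/^tcp/ {++S[$NF]} END {for (a in S) print a, S[a]}'
```

This prints one line per connection state with its count (here, ESTABLISHED 2 and TIME_WAIT 1; the order of an awk `for (a in S)` loop is unspecified, so pipe through `sort` if you need stable output).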

2. Top 20 IP addresses by connection count (usually used to find attack sources):

netstat -anlp | grep 80 | grep tcp | awk '{print $5}' | awk -F: '{print $1}' | sort | uniq -c | sort -nr | head -n20

netstat -ant | awk '/:80/ {split($5, ip, ":"); ++A[ip[1]]} END {for (i in A) print A[i], i}' | sort -rn | head -n20
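To see what the split()-based counter produces, here it is run on a few made-up connection lines (the fifth field is the remote address:port, as in netstat output):

```shell
# Hypothetical netstat-style lines; the fifth field is the remote addr:port
sample='tcp 0 0 10.0.0.1:80 192.168.1.5:40001 ESTABLISHED
tcp 0 0 10.0.0.1:80 192.168.1.5:40002 ESTABLISHED
tcp 0 0 10.0.0.1:80 192.168.1.6:40003 ESTABLISHED'

# Count connections per remote IP, busiest first
echo "$sample" | awk '/:80/ {split($5, ip, ":"); ++A[ip[1]]} END {for (i in A) print A[i], i}' | sort -rn
# -> 2 192.168.1.5
#    1 192.168.1.6
```

split() cuts the addr:port string at the colon, so ip[1] is the bare IP and the array A accumulates one counter per remote host.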

3. Use tcpdump to sniff port 80 traffic and see which source IPs are most active

tcpdump -i eth0 -tnn dst port 80 -c 1000 | awk -F"." '{print $1"."$2"."$3"."$4}' | sort | uniq -c | sort -nr | head -20


4. Find the remote addresses with the most TIME_WAIT connections

netstat -n | grep TIME_WAIT | awk '{print $5}' | sort | uniq -c | sort -rn | head -n20

5. Find the source IPs with the most SYN connections

netstat -an | grep SYN | awk '{print $5}' | awk -F: '{print $1}' | sort | uniq -c | sort -nr | more

6. List the process listening on a given port

netstat -ntlp | grep 80 | awk '{print $7}' | cut -d/ -f1

Website log analysis, part 1 (Apache):

1. Obtain the top 10 IP addresses

cat access.log | awk '{print $1}' | sort | uniq -c | sort -nr | head -10
cat access.log | awk '{counts[$1] += 1} END {for (ip in counts) print counts[ip], ip}'

2. The most frequently accessed files or pages (top 20)

cat access.log | awk '{print $11}' | sort | uniq -c | sort -nr | head -20
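A quick check of this counting pattern on synthetic log lines. Note that the field holding the request path depends on your LogFormat; the (made-up) lines below follow the common log format, where $7 is the path, so the demo counts $7 rather than $11:

```shell
# Three synthetic access-log lines in common log format
sample='1.1.1.1 - - [26/Jun/2011:10:00:00 +0000] "GET /index.php HTTP/1.1" 200 100
2.2.2.2 - - [26/Jun/2011:10:00:01 +0000] "GET /index.php HTTP/1.1" 200 100
3.3.3.3 - - [26/Jun/2011:10:00:02 +0000] "GET /about.html HTTP/1.1" 200 100'

# Most-requested paths first ($7 = request path in this format)
echo "$sample" | awk '{print $7}' | sort | uniq -c | sort -nr
```

The first `sort` groups identical paths so `uniq -c` can count them; the second `sort -nr` orders the counts descending, putting /index.php (2 hits) on top.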

3. List the exe files transferred most (commonly used when analyzing download sites)

cat access.log | awk '($7 ~ /\.exe/) {print $10 " " $1 " " $4 " " $7}' | sort -nr | head -20

4. List the exe files larger than 200000 bytes (~200 KB) and the number of occurrences of each

cat access.log | awk '($10 > 200000 && $7 ~ /\.exe/) {print $7}' | sort -n | uniq -c | sort -nr | head -100

5. If the last column of the log records the page transfer time, list the pages that take longest to reach the client

cat access.log | awk '($7 ~ /\.php/) {print $NF " " $1 " " $4 " " $7}' | sort -nr | head -100

6. List the most time-consuming pages (more than 60 seconds) and their number of occurrences

cat access.log | awk '($NF > 60 && $7 ~ /\.php/) {print $7}' | sort -n | uniq -c | sort -nr | head -100

7. List objects whose transmission time exceeds 30 seconds

cat access.log | awk '($NF > 30) {print $7}' | sort -n | uniq -c | sort -nr | head -20

8. Count total website traffic (in GB)

cat access.log | awk '{sum += $10} END {print sum/1024/1024/1024}'
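For example, on two synthetic log lines whose tenth field is the byte count (dividing by 1024 twice for MB instead of three times for GB, just to keep the numbers readable):

```shell
# Two synthetic log lines; $10 is the response size in bytes (1 MiB each)
sample='1.1.1.1 - - [26/Jun/2011:10:00:00 +0000] "GET /a.iso HTTP/1.1" 200 1048576
2.2.2.2 - - [26/Jun/2011:10:00:01 +0000] "GET /b.iso HTTP/1.1" 200 1048576'

# Sum the $10 byte counts and convert to MB
echo "$sample" | awk '{sum += $10} END {print sum/1024/1024 " MB"}'
# -> 2 MB
```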

9. List 404 responses

awk '($9 ~ /404/)' access.log | awk '{print $9, $7}' | sort

10. HTTP status statistics

cat access.log | awk '{counts[$9] += 1} END {for (code in counts) print code, counts[code]}'
cat access.log | awk '{print $9}' | sort | uniq -c | sort -rn
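The per-code counter can be demonstrated on a few synthetic lines ($9 is the status code in common log format; `sort` is appended only to make the output order deterministic):

```shell
# Synthetic log lines: two 200s and one 404
sample='1.1.1.1 - - [26/Jun/2011:10:00:00 +0000] "GET /a HTTP/1.1" 200 10
2.2.2.2 - - [26/Jun/2011:10:00:01 +0000] "GET /b HTTP/1.1" 404 0
3.3.3.3 - - [26/Jun/2011:10:00:02 +0000] "GET /a HTTP/1.1" 200 10'

# Count occurrences of each status code ($9)
echo "$sample" | awk '{counts[$9] += 1} END {for (code in counts) print code, counts[code]}' | sort
# -> 200 2
#    404 1
```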

11. Analyze spiders: check which crawlers are fetching content

/usr/sbin/tcpdump -i eth0 -l -s 0 -w - dst port 80 | strings | grep -i user-agent | grep -iE 'bot|crawler|slurp|spider'

Website log analysis, part 2 (Squid): collect traffic by domain

zcat squid_access.log.tar.gz | awk '{print $10, $7}' | awk 'BEGIN {FS="[ /]"} {trfc[$4] += $1} END {for (domain in trfc) printf "%s\t%d\n", domain, trfc[domain]}'
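The second awk stage is the interesting part: with FS="[ /]" it splits on both spaces and slashes, so in a "bytes http://domain/path" pair the domain lands in $4. A sketch with hypothetical byte/URL pairs instead of a real Squid log:

```shell
# Feed the second awk stage hypothetical "bytes URL" pairs;
# FS="[ /]" splits on spaces and slashes, so $4 is the domain of an http:// URL
printf '%s\n' \
  '1000 http://example.com/a' \
  '2000 http://example.com/b' \
  '500 http://other.org/c' |
  awk 'BEGIN {FS="[ /]"} {trfc[$4] += $1} END {for (d in trfc) printf "%s\t%d\n", d, trfc[d]}' | sort
```

This prints one tab-separated line per domain with its byte total (example.com gets 3000, other.org gets 500).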

Database
1. View the SQL statements executed against the database

/usr/sbin/tcpdump -i eth0 -s 0 -l -w - dst port 3306 | strings | egrep -i 'SELECT|UPDATE|DELETE|INSERT|SET|COMMIT|ROLLBACK|CREATE|DROP|ALTER|CALL'

System Debug Analysis
1. Trace the system calls of a running process:
strace -p pid
2. Attach a debugger to a specified process:
gdb -p pid

awk '{ip[$1] += 1} END {for (i in ip) print i, ip[i]}' access.log | wc -l
access.log is the Apache access log; this pipeline counts the number of unique client IPs.

1. Output the number of unique IP addresses directly:
cat access_log_2011_06_26.log | awk '{print $1}' | sort | uniq | wc -l
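Note that uniq only collapses adjacent duplicate lines, so a sort must come before it or the count will be inflated. A quick demonstration on three made-up IPs:

```shell
# The repeated IP is not adjacent in the raw input
sample='1.1.1.1
2.2.2.2
1.1.1.1'

# Without sorting, uniq keeps all 3 lines (the duplicates are not adjacent)
echo "$sample" | uniq | wc -l

# Sorting first groups the duplicates, giving the true unique count of 2
echo "$sample" | sort | uniq | wc -l
```

`sort -u`, or `sort | uniq | wc -l`, is the reliable way to count distinct values.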
