1. View TCP connection status
netstat -nat | awk '{print $6}' | sort | uniq -c | sort -rn
netstat -n | awk '/^tcp/ {++S[$NF]} END {for (a in S) print a, S[a]}'
netstat -n | awk '/^tcp/ {++state[$NF]} END {for (key in state) print key, "\t", state[key]}'
netstat -n | awk '/^tcp/ {++arr[$NF]} END {for (k in arr) print k, "\t", arr[k]}'
netstat -n | awk '/^tcp/ {print $NF}' | sort | uniq -c | sort -rn
netstat -ant | awk '{print $NF}' | grep -v '[a-z]' | sort | uniq -c
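As a sanity check, the awk state-counting idiom above can be run offline against canned input; the netstat-style lines below are fabricated for illustration:

```shell
# Count TCP connection states with an awk associative array keyed on the
# last field ($NF); the input lines are fabricated netstat -n output.
printf '%s\n' \
  'tcp 0 0 10.0.0.1:80 10.0.0.2:5000 ESTABLISHED' \
  'tcp 0 0 10.0.0.1:80 10.0.0.3:5001 TIME_WAIT' \
  'tcp 0 0 10.0.0.1:80 10.0.0.4:5002 TIME_WAIT' |
awk '/^tcp/ {++S[$NF]} END {for (a in S) print a, S[a]}' | sort
# Prints:
# ESTABLISHED 1
# TIME_WAIT 2
```

The trailing sort only makes the output order deterministic; awk's `for (a in S)` iterates in an unspecified order.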
2. Find the 20 IP addresses with the most connections (often used to locate attack sources):
netstat -anlp | grep 80 | grep tcp | awk '{print $5}' | awk -F: '{print $1}' | sort | uniq -c | sort -nr | head -n20
netstat -ant | awk '/:80/ {split($5, ip, ":"); ++A[ip[1]]} END {for (i in A) print A[i], i}' | sort -rn | head -n20
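The split-and-count step of the second one-liner can likewise be tried on fabricated lines; field 5 is the foreign address in ip:port form:

```shell
# Count connections per peer IP: split field 5 on ':' and key on the IP part.
printf '%s\n' \
  'tcp 0 0 10.0.0.1:80 192.168.1.9:40001 ESTABLISHED' \
  'tcp 0 0 10.0.0.1:80 192.168.1.9:40002 ESTABLISHED' \
  'tcp 0 0 10.0.0.1:80 172.16.0.5:51000 ESTABLISHED' |
awk '/:80/ {split($5, ip, ":"); ++A[ip[1]]} END {for (i in A) print A[i], i}' | sort -rn
# Prints:
# 2 192.168.1.9
# 1 172.16.0.5
```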
3. Use tcpdump to sniff port 80 traffic and see which source IP sends the most packets
tcpdump -i eth0 -tnn dst port 80 -c 1000 | awk -F"." '{print $1"."$2"."$3"."$4}' | sort | uniq -c | sort -nr | head -20
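The awk stage of this pipeline simply keeps the first four dot-separated fields, which trims the port off tcpdump's `address.port` notation; the sample line below is fabricated:

```shell
# 'IP 10.0.0.5.40000' splits on '.' into 'IP 10', '0', '0', '5', '40000', ...
# so rejoining the first four fields yields 'IP 10.0.0.5' (port dropped).
printf '%s\n' 'IP 10.0.0.5.40000 > 10.0.0.1.80: Flags [S]' |
awk -F"." '{print $1"."$2"."$3"."$4}'
# Prints: IP 10.0.0.5
```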
4. Find the IP addresses with the most TIME_WAIT connections
netstat -n | grep TIME_WAIT | awk '{print $5}' | sort | uniq -c | sort -rn | head -n20
5. Find the IP addresses with the most SYN connections
netstat -an | grep SYN | awk '{print $5}' | awk -F: '{print $1}' | sort | uniq -c | sort -nr | more
6. Find the process that owns a given port
netstat -ntlp | grep 80 | awk '{print $7}' | cut -d/ -f1
Website log analysis, part 1 (Apache):
1. Get the 10 IP addresses with the most accesses
cat access.log | awk '{print $1}' | sort | uniq -c | sort -nr | head -10
cat access.log | awk '{counts[$11] += 1} END {for (url in counts) print counts[url], url}'
2. The 20 most frequently accessed files or pages
cat access.log | awk '{print $11}' | sort | uniq -c | sort -nr | head -20
3. List the exe files transferred in the largest volume (useful when analyzing download sites)
cat access.log | awk '($7 ~ /\.exe/) {print $10 " " $1 " " $4 " " $7}' | sort -nr | head -20
4. List the exe files larger than 200000 bytes (about 200 KB) and the number of occurrences of each
cat access.log | awk '($10 > 200000 && $7 ~ /\.exe/) {print $7}' | sort -n | uniq -c | sort -nr | head -100
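The combined size-and-extension filter can be checked against fabricated combined-log lines (field 7 is the request path, field 10 the response size in bytes):

```shell
# Only requests for .exe files larger than 200000 bytes pass the filter;
# the log lines below are fabricated samples.
log='1.2.3.4 - - [01/Jan/2024:00:00:00 +0000] "GET /big.exe HTTP/1.1" 200 500000
1.2.3.4 - - [01/Jan/2024:00:00:01 +0000] "GET /small.exe HTTP/1.1" 200 1000
5.6.7.8 - - [01/Jan/2024:00:00:02 +0000] "GET /big.exe HTTP/1.1" 200 500000'
printf '%s\n' "$log" | awk '($10 > 200000 && $7 ~ /\.exe/) {print $7}' | sort | uniq -c | sort -nr
# Only /big.exe passes both conditions, with a count of 2.
```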
5. If the last field of each log line records the page transfer time, list the most time-consuming pages
cat access.log | awk '($7 ~ /\.php/) {print $NF " " $1 " " $4 " " $7}' | sort -nr | head -100
6. List the most time-consuming pages (taking more than 60 seconds) and their occurrence counts
cat access.log | awk '($NF > 60 && $7 ~ /\.php/) {print $7}' | sort -n | uniq -c | sort -nr | head -100
7. List objects whose transfer took more than 30 seconds
cat access.log | awk '($NF > 30) {print $7}' | sort -n | uniq -c | sort -nr | head -20
8. Count total site traffic (in GB)
cat access.log | awk '{sum += $10} END {print sum/1024/1024/1024}'
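The summation can be verified on a fabricated two-line log; the example prints raw bytes, and the one-liner above divides the same sum by 1024^3 to get GB:

```shell
# Sum field 10 (bytes transferred) across all lines of a fabricated log.
log='1.2.3.4 - - [01/Jan/2024:00:00:00 +0000] "GET /a HTTP/1.1" 200 100
5.6.7.8 - - [01/Jan/2024:00:00:01 +0000] "GET /b HTTP/1.1" 200 300'
printf '%s\n' "$log" | awk '{sum += $10} END {print sum}'
# Prints: 400
```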
9. Count 404 requests
awk '($9 ~ /404/)' access.log | awk '{print $9, $7}' | sort
10. Count HTTP status codes
cat access.log | awk '{counts[$9] += 1} END {for (code in counts) print code, counts[code]}'
cat access.log | awk '{print $9}' | sort | uniq -c | sort -rn
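Both status-code one-liners produce the same counts; here is the first on fabricated combined-log lines (field 9 is the status code):

```shell
# Tally HTTP status codes with an associative array; sample lines are fabricated.
log='1.2.3.4 - - [01/Jan/2024:00:00:00 +0000] "GET /a HTTP/1.1" 200 123
1.2.3.4 - - [01/Jan/2024:00:00:01 +0000] "GET /b HTTP/1.1" 404 10
5.6.7.8 - - [01/Jan/2024:00:00:02 +0000] "GET /a HTTP/1.1" 200 456'
printf '%s\n' "$log" | awk '{counts[$9] += 1} END {for (code in counts) print code, counts[code]}' | sort
# Prints:
# 200 2
# 404 1
```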
11. Find out which crawlers are fetching the content
/usr/sbin/tcpdump -i eth0 -l -s 0 -w - dst port 80 | strings | grep -i user-agent | grep -iE 'bot|crawler|slurp|spider'
Website log analysis, part 2 (Squid)
Traffic statistics by domain
zcat squid_access.log.tar.gz | awk '{print $10, $7}' | awk 'BEGIN {FS="[ /]"} {trfc[$4] += $1} END {for (domain in trfc) printf "%s\t%d\n", domain, trfc[domain]}'
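Because the second awk splits on both spaces and slashes, field 1 of each `bytes url` line is the byte count and field 4 is the domain; a fabricated sample:

```shell
# Sum bytes per domain: with FS="[ /]", '512 http://a.example/x' splits into
# '512', 'http:', '', 'a.example', 'x', so $1 is bytes and $4 is the domain.
printf '%s\n' \
  '512 http://a.example/x' \
  '1024 http://a.example/y' \
  '2048 http://b.example/z' |
awk 'BEGIN {FS="[ /]"} {trfc[$4] += $1} END {for (d in trfc) printf "%s\t%d\n", d, trfc[d]}' | sort
# Prints (tab-separated):
# a.example	1536
# b.example	2048
```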
A more efficient Perl version can be downloaded from: http://docs.linuxtone.org/soft/tools/tr.pl
Database
1. View the SQL statements being executed against the database
/usr/sbin/tcpdump -i eth0 -s 0 -l -w - dst port 3306 | strings | egrep -i 'SELECT|UPDATE|DELETE|INSERT|SET|COMMIT|ROLLBACK|CREATE|DROP|ALTER|CALL'
System debugging and analysis
1. Trace the system calls of a running process
strace -p pid
2. Attach a debugger to a running process
gdb -p pid