My small website runs on an Aliyun ECS instance, and I occasionally analyze its server logs to check traffic and watch for signs of malicious activity. Here is a collection of server log analysis commands I've gathered; give them a try!
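All the commands below assume the common Apache/Nginx "combined" log format, where awk's whitespace-split fields line up as shown here (the sample line, IP, and URLs are made up for illustration):

```shell
# Hypothetical sample line in "combined" access log format:
line='111.111.111.111 - - [16/Aug/2015:14:03:12 +0800] "GET /index.php HTTP/1.1" 200 5843 "http://www.example.com/" "Mozilla/5.0"'
# awk splits on whitespace, so the fields used throughout this article are:
#   $1  = client IP       $4  = timestamp ("[16/Aug/2015:14:03:12")
#   $7  = requested URL   $9  = HTTP status    $10 = response bytes
#   $11 = referer         $12 = start of the User-Agent string
echo "$line" | awk '{print $1, $7, $9, $10}'
# → 111.111.111.111 /index.php 200 5843
```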
1. Count how many distinct IPs have accessed the site:
awk '{print $1}' log_file|sort|uniq|wc -l
PS: wc -l counts the number of lines.
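As a minimal sketch, using a throwaway sample log at /tmp/sample.log (hypothetical path and IPs), the pipeline counts the distinct IPs like this:

```shell
# Build a tiny fake access log: two requests from one IP, one from another.
printf '%s\n' \
  '1.2.3.4 - - [16/Aug/2015:14:00:01 +0800] "GET /a HTTP/1.1" 200 100' \
  '1.2.3.4 - - [16/Aug/2015:14:00:02 +0800] "GET /b HTTP/1.1" 200 100' \
  '5.6.7.8 - - [16/Aug/2015:14:00:03 +0800] "GET /a HTTP/1.1" 200 100' > /tmp/sample.log
# $1 is the IP; sort groups duplicates, uniq removes them, wc -l counts.
awk '{print $1}' /tmp/sample.log | sort | uniq | wc -l
# → 2 (two distinct IPs)
```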
2. View the number of times a page has been accessed:
grep "/index.php" log_file | wc -l
3. Count how many pages each IP accessed:
awk '{++S[$1]} END {for (a in S) print a,S[a]}' log_file > log.txt
sort -n -t ' ' -k 2 log.txt # use sort to order the results further
4. Sort each IP by the number of pages it accessed, ascending:
awk '{++S[$1]} END {for (a in S) print S[a],a}' log_file | sort -n
5. See which pages a particular IP accessed:
grep ^111.111.111.111 log_file| awk '{print $1,$7}'
6. Count visiting IPs while excluding search-engine spiders (keeps only User-Agents beginning with "Mozilla"):
awk '{print $12,$1}' log_file | grep ^\"Mozilla | awk '{print $2}' |sort | uniq | wc -l
7. Count distinct IPs that visited during the 14:00 hour on August 16, 2015:
awk '{print $4,$1}' log_file | grep 16/Aug/2015:14 | awk '{print $2}'| sort | uniq | wc -l
8. View the top 10 most frequent visitor IPs:
awk '{print $1}' access_log | sort | uniq -c | sort -nr | head -10
uniq -c groups identical lines and prefixes each group with its count.
cat access.log | awk '{print $1}' | sort | uniq -c | sort -nr | head -10
cat access.log | awk '{counts[$11]+=1}; END {for (url in counts) print counts[url], url}'
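The sort | uniq -c | sort -nr idiom used throughout this article can be demonstrated on a toy input:

```shell
# sort groups identical lines together, uniq -c collapses each group and
# prefixes its count, and sort -nr orders the groups by count, descending.
printf '1.2.3.4\n5.6.7.8\n1.2.3.4\n1.2.3.4\n5.6.7.8\n' | sort | uniq -c | sort -nr
# → "3 1.2.3.4" then "2 5.6.7.8" (counts are left-padded with spaces)
```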
9. The 10 most visited files or pages:
cat log_file | awk '{print $11}' | sort | uniq -c | sort -nr | head -10
cat log_file | awk '{print $11}' | sort | uniq -c | sort -nr | head -20
awk '{print $1}' log_file | sort -n -r | uniq -c | sort -n -r | head -20 # top 20 IPs by access count
10. Count visits per subdomain, computed from the Referer header (slightly inaccurate):
cat access.log | awk '{print $11}' | sed -e 's/http:\/\///' -e 's/\/.*//' | sort | uniq -c | sort -rn | head -20
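The two sed expressions do the subdomain extraction: the first strips the http:// scheme, the second deletes everything from the first remaining slash. A quick check on a made-up referer value:

```shell
# Strip the scheme, then the path, leaving only the host part.
echo 'http://blog.example.com/post/1' | sed -e 's/http:\/\///' -e 's/\/.*//'
# → blog.example.com
```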
11. List the files with the largest transfer sizes:
cat www.access.log |awk '($7~/\.php/){print $10 " " $1 " " $4 " " $7}'|sort -nr|head -100
12. List pages whose response exceeds 200000 bytes (about 200KB), with their occurrence counts:
cat www.access.log |awk '($10 > 200000 && $7~/\.php/){print $7}'|sort -n|uniq -c|sort -nr|head -100
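A sketch of the filter on two fabricated log lines; only the .php request larger than 200000 bytes survives:

```shell
# $10 is the byte count and $7 the URL; both conditions must hold.
printf '%s\n' \
  '1.2.3.4 - - [16/Aug/2015:14:00:01 +0800] "GET /big.php HTTP/1.1" 200 300000' \
  '1.2.3.4 - - [16/Aug/2015:14:00:02 +0800] "GET /small.php HTTP/1.1" 200 100' |
awk '($10 > 200000 && $7~/\.php/){print $7}'
# → /big.php
```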
13. If the last column of the log records the page transfer time, list the pages that took longest to deliver to the client:
cat www.access.log |awk '($7~/\.php/){print $NF " " $1 " " $4 " " $7}'|sort -nr|head -100
14. List the most time-consuming pages (over 60 seconds) and how often each occurs:
cat www.access.log |awk '($NF > 60 && $7~/\.php/){print $7}'|sort -n|uniq -c|sort -nr|head -100
15. List files whose transfer time exceeded 30 seconds:
cat www.access.log |awk '($NF > 30){print $7}'|sort -n|uniq -c|sort -nr|head -20
16. List how many instances of each process are running on the server, in descending order:
ps -ef | awk -F ' ' '{print $8 " " $9}' |sort | uniq -c |sort -nr |head -20
17. View the current number of concurrent Apache connections, and compare it against the MaxClients value in httpd.conf:
netstat -an | grep ESTABLISHED | wc -l
18. The following commands can be used to inspect the data:
ps -ef|grep httpd|wc -l
This counts the httpd processes; with Apache, even a single request spawns a process. A result of, say, 1388 would mean Apache is handling 1388 concurrent requests; the process count adjusts automatically with load.
netstat -nat|grep -i "80"|wc -l
netstat -an prints the system's current network connection state, grep -i "80" extracts the connections involving port 80, and wc -l counts them. The resulting number is the total number of current connections on port 80.
netstat -na|grep ESTABLISHED|wc -l
netstat -an prints the current network connection state, grep ESTABLISHED extracts the connections that are fully established, and wc -l counts them. The resulting number is the total number of currently established connections.
netstat -nat | grep ESTABLISHED | wc -l
To view the detailed records of all established connections, drop the final wc -l.
19. Output the number of connections per IP, plus the total number of connections in each state:
netstat -n | awk '/^tcp/{n=split($(NF-1),array,":");if(n<=2)++S[array[1]];else++S[array[4]];++s[$NF];++N} END {for(a in S){printf("%-20s %s\n",a,S[a]);++I}printf("%-20s %s\n","TOTAL_IP",I);for(a in s) printf("%-20s %s\n",a,s[a]);printf("%-20s %s\n","TOTAL_LINK",N);}'
20. Other collected commands
Analyze the top 20 most-visited URLs on 2012-05-04, sorted:
cat access.log |grep '04/May/2012'| awk '{print $11}'|sort|uniq -c|sort -nr|head -20
List the IPs that visited pages whose URL contains www.abc.com:
cat access_log | awk '($11~/www\.abc\.com/){print $1}' | sort | uniq -c | sort -nr
Get the 10 most frequent visiting IPs (can also be restricted by time):
cat linewow-access.log|awk '{print $1}'|sort|uniq -c|sort -nr|head -10
Query the log within a given time period:
cat log_file | egrep '15/Aug/2015|16/Aug/2015' |awk '{print $1}'|sort|uniq -c|sort -nr|head -10
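A sketch of the date-range filter on two fabricated log lines; only the line from one of the two target days passes:

```shell
# egrep keeps lines whose timestamp mentions either of the two days.
printf '%s\n' \
  '1.2.3.4 - - [15/Aug/2015:09:00:00 +0800] "GET / HTTP/1.1" 200 10' \
  '5.6.7.8 - - [17/Aug/2015:09:00:00 +0800] "GET / HTTP/1.1" 200 10' |
egrep '15/Aug/2015|16/Aug/2015' | awk '{print $1}'
# → 1.2.3.4
```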
List, in descending order of hits, the IPs that accessed "/index.php?g=member&m=public&a=sendvalidcode" between 2015/8/15 and 2015/8/16:
cat log_file | egrep '15/Aug/2015|16/Aug/2015' | awk '{if($7 == "/index.php?g=Member&m=Public&a=sendValidCode") print $1,$7}'|sort|uniq -c|sort -nr
($7~/\.php/) matches when $7 contains .php; this command lists the 100 most time-consuming PHP pages:
cat log_file |awk '($7~/\.php/){print $NF " " $1 " " $4 " " $7}'|sort -nr|head -100
List the most time-consuming pages (over 60 seconds) and their occurrence counts:
cat access.log | awk '($NF > 60 && $7~/\.php/){print $7}' | sort -n | uniq -c | sort -nr | head -100
Calculate total site traffic (in GB):
cat access.log |awk '{sum+=$10} END {print sum/1024/1024/1024}'
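A sketch of the traffic sum on two fabricated 1 GiB responses (field $10 is the byte count):

```shell
# Two responses of 1073741824 bytes (1 GiB) each sum to 2 GiB.
printf '%s\n' \
  '1.2.3.4 - - [20/Mar/2011:00:00:01 +0800] "GET /a HTTP/1.1" 200 1073741824' \
  '5.6.7.8 - - [20/Mar/2011:00:00:02 +0800] "GET /b HTTP/1.1" 200 1073741824' |
awk '{sum+=$10} END {print sum/1024/1024/1024}'
# → 2
```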
List the 404 responses:
awk '($9 ~/404/)' access.log | awk '{print $9,$7}' | sort
Count HTTP status codes:
cat access.log | awk '{counts[$9]+=1}; END {for (code in counts) print code, counts[code]}'
cat access.log | awk '{print $9}' | sort | uniq -c | sort -rn
Requests per second:
watch "awk '{if(\$9~/200|30|404/)count[\$4]++}END{for(a in count) print a,count[a]}' log_file | sort -k 2 -nr | head -n 10"
Bandwidth statistics
cat apache.log | awk '{if($7~/GET/) count++}END{print "client_request="count}'
cat apache.log | awk '{byte+=$11}END{print "client_kbyte_out="byte/1024"KB"}'
Find the 10 most frequent visiting IPs on a given day:
cat /tmp/access.log | grep "20/Mar/2011" | awk '{print $1}' | sort | uniq -c | sort -nr | head
See what the IP with the most connections that day was requesting:
cat access.log | grep "10.0.21.17" | awk '{print $7}' | sort | uniq -c | sort -nr | head -n 10
The 10 hour-long periods with the most IP connections:
awk -vFS="[:]" '{gsub("-.*","",$1);num[$2" "$1]++}END{for(i in num) print i,num[i]}' log_file | sort -n -k 3 -r | head -10
Find the minutes with the most visits:
awk '{print $4}' access.log | grep "20/Mar/2011" | cut -c 14-18 | sort | uniq -c | sort -nr | head
Extract five minutes' worth of log:
if [ $DATE_MINUTE != $DATE_END_MINUTE ]; then
    # if the start and end timestamps differ, take the line number of the
    # first occurrence of the start timestamp (and likewise for the end timestamp)
    START_LINE=$(sed -n "/$DATE_MINUTE/=" $APACHE_LOG | head -n1)
fi
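The sed -n "/pattern/=" trick prints the line numbers of matching lines instead of their contents, which is what lets the script turn a timestamp into a line number. A toy check:

```shell
# "=" makes sed print the line number of each line matching /b/;
# head -n1 keeps only the first match.
printf 'a\nb\nb\nc\n' | sed -n "/b/=" | head -n1
# → 2
```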
View TCP connection states:
netstat -nat | awk '{print $6}' | sort | uniq -c | sort -rn
netstat -n | awk '/^tcp/{++S[$NF]}; END {for (a in S) print a, S[a]}'
netstat -n | awk '/^tcp/{++state[$NF]}; END {for (key in state) print key,"\t",state[key]}'
netstat -n | awk '/^tcp/{++arr[$NF]}; END {for (k in arr) print k,"\t",arr[k]}'
netstat -n | awk '/^tcp/{print $NF}' | sort | uniq -c | sort -rn
netstat -ant | awk '{print $NF}' | grep -v '[a-z]' | sort | uniq -c
netstat -ant | awk '/:80/{split($5,ip,":");++S[ip[1]]}END{for(a in S) print S[a],a}' | sort -n
netstat -ant | awk '/:80/{split($5,ip,":");++S[ip[1]]}END{for(a in S) print S[a],a}' | sort -rn | head -n 10
Count requests per HTTP status code, with a header row:
awk 'BEGIN{printf("http_code\tcount_num\n")}{count[$10]++}END{for(a in count) printf a"\t\t"count[a]"\n"}' log_file
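Since real netstat output depends on the machine's live connections, the state-counting one-liners above (the /^tcp/{++S[$NF]} family) can be sanity-checked against canned netstat-style lines (addresses made up):

```shell
# Three fake connection lines: two ESTABLISHED, one TIME_WAIT.
printf '%s\n' \
  'tcp 0 0 10.0.0.1:80 1.2.3.4:5555 ESTABLISHED' \
  'tcp 0 0 10.0.0.1:80 1.2.3.4:5556 TIME_WAIT' \
  'tcp 0 0 10.0.0.1:80 9.9.9.9:1234 ESTABLISHED' |
awk '/^tcp/{++S[$NF]} END {for (a in S) print a, S[a]}' | sort
# → ESTABLISHED 2
#   TIME_WAIT 1
```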
Find the top 20 requesting IPs (often used to locate the source of an attack):
netstat -anlp | grep 80 | grep tcp | awk '{print $5}' | awk -F: '{print $1}' | sort | uniq -c | sort -nr | head -n 20
netstat -ant | awk '/:80/{split($5,ip,":");++A[ip[1]]}END{for(i in A) print A[i],i}' | sort -rn | head -n 20
Use tcpdump to sniff port-80 access and see which IP visits most:
tcpdump -i eth0 -tnn dst port 80 -c 1000 | awk -F"." '{print $1"."$2"."$3"."$4}' | sort | uniq -c | sort -nr | head -20
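The awk -F"." part extracts the source IP from tcpdump's output (with -tnn the timestamp is suppressed and lines begin with "IP src.port > dst.port"). A check against a canned line:

```shell
# Splitting on "." makes the first four fields span "IP <a>.<b>.<c>.<d>",
# so re-joining them recovers the source address.
echo 'IP 1.2.3.4.5555 > 10.0.0.1.80: Flags [S], seq 0' |
awk -F"." '{print $1"."$2"."$3"."$4}'
# → IP 1.2.3.4
```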
Find connections stuck in TIME_WAIT:
netstat -n | grep TIME_WAIT | awk '{print $5}' | sort | uniq -c | sort -rn | head -n 20
Check for excessive SYN connections:
netstat -an | grep SYN | awk '{print $5}' | awk -F: '{print $1}' | sort | uniq -c | sort -nr | more
Find the process listening on a given port:
netstat -ntlp | grep 80 | awk '{print $7}' | cut -d/ -f1
View the total and currently established connection counts for an IP:
netstat -ant | grep $IP:80 | wc -l
netstat -ant | grep $IP:80 | grep EST | wc -l
View access counts per IP:
netstat -nat|grep ":80"|awk '{print $5}' |awk -F: '{print $1}' | sort| uniq -c|sort -n
Linux commands for analyzing current connection states:
netstat -n | awk '/^tcp/{++S[$NF]} END {for(a in S) print a, S[a]}'
watch "netstat -n | awk '/^tcp/{++S[\$NF]} END {for(a in S) print a, S[a]}'" # with watch you can monitor continuously
LAST_ACK 5       # closing a TCP connection requires a shutdown in each direction, each signalled by a FIN; after sending the last FIN the sender sits in LAST_ACK until the peer ACKs that FIN, at which point the whole connection is closed
SYN_RECV 30      # number of connection requests waiting to be processed
ESTABLISHED 1597 # normal data-transfer state
FIN_WAIT1 51     # the server side actively requested closing the connection
FIN_WAIT2 504    # the client side interrupted the connection
TIME_WAIT 1057   # requests already handled, waiting for the timeout to expire