My small website runs on an Aliyun ECS instance, and I occasionally analyze its server logs to check traffic and look for signs of malicious activity. Here is a collection of server log analysis commands I've gathered — give them a try!
1. Count how many distinct IPs have visited:
awk '{print $1}' log_file | sort | uniq | wc -l
# wc -l counts the resulting lines
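Item 1's pipeline can be tried on a tiny made-up log. This is a sketch on hypothetical sample lines (the IPs and paths are invented), assuming the common combined log format where field 1 is the client IP:

```shell
# Three hypothetical combined-format log lines; two distinct client IPs
log='1.2.3.4 - - [16/Aug/2015:14:00:01 +0800] "GET /index.php HTTP/1.1" 200 512
5.6.7.8 - - [16/Aug/2015:14:00:02 +0800] "GET /about.html HTTP/1.1" 200 1024
1.2.3.4 - - [16/Aug/2015:14:00:03 +0800] "GET /contact.php HTTP/1.1" 404 128'

# $1 is the client IP; sort|uniq deduplicates, wc -l counts what remains
unique_ips=$(printf '%s\n' "$log" | awk '{print $1}' | sort | uniq | wc -l)
echo "$unique_ips"   # two distinct client IPs
```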
2. Count how many times a particular page was accessed:
grep "/index.php" log_file | wc -l
3. Count how many pages each IP accessed:
awk '{++S[$1]} END {for (a in S) print a,S[a]}' log_file > log.txt
sort -n -t ' ' -k 2 log.txt   # use sort to order the result further
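The awk associative array S[] above maps each IP to its page count. Here is a minimal sketch of it at work, on made-up log lines written to a hypothetical temp file:

```shell
# Hypothetical sample log: 10.0.0.1 requests two pages, 10.0.0.2 one page
printf '%s\n' \
  '10.0.0.1 - - [16/Aug/2015:14:00:01 +0800] "GET /a HTTP/1.1" 200 10' \
  '10.0.0.1 - - [16/Aug/2015:14:00:02 +0800] "GET /b HTTP/1.1" 200 10' \
  '10.0.0.2 - - [16/Aug/2015:14:00:03 +0800] "GET /a HTTP/1.1" 200 10' > /tmp/sample_access.log

# Each line increments S[IP]; END prints "IP count", then sort orders by count
pages=$(awk '{++S[$1]} END {for (a in S) print a,S[a]}' /tmp/sample_access.log | sort -t ' ' -k 2 -n)
echo "$pages"
```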
4. Sort the number of pages accessed per IP, from smallest to largest:
awk '{++S[$1]} END {for (a in S) print S[a],a}' log_file | sort -n
5. See which pages a particular IP accessed:
grep ^111.111.111.111 log_file | awk '{print $1,$7}'
6. Exclude search-engine spiders and count the remaining visits:
awk '{print $12,$1}' log_file | grep ^\"Mozilla | awk '{print $2}' | sort | uniq | wc -l
7. Count how many IPs visited during the 14:00 hour on August 16, 2015:
awk '{print $4,$1}' log_file | grep 16/Aug/2015:14 | awk '{print $2}' | sort | uniq | wc -l
8. View the top 10 visiting IP addresses:
awk '{print $1}' access_log | sort | uniq -c | sort -nr | head -10
# uniq -c groups identical lines and prefixes each with its count
cat access.log | awk '{print $1}' | sort | uniq -c | sort -nr | head -10
cat access.log | awk '{counts[$(11)]+=1}; END {for (url in counts) print counts[url], url}'
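The `sort | uniq -c | sort -nr` idiom is the core of most of these commands. A small sketch on made-up input shows why the first `sort` matters — `uniq` only collapses adjacent duplicates, so the input must be sorted first:

```shell
# 'b' occurs three times, 'a' twice; uniq -c prefixes each group with its count,
# and the second sort -nr ranks groups by that count, highest first
top=$(printf 'b\na\nb\nb\na\n' | sort | uniq -c | sort -nr | head -1)
echo "$top"   # the most frequent item, with its count of 3
```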
9. The 10 most-visited files or pages:
cat log_file | awk '{print $11}' | sort | uniq -c | sort -nr | head -10
cat log_file | awk '{print $11}' | sort | uniq -c | sort -nr | head -20
awk '{print $1}' log_file | sort -n -r | uniq -c | sort -n -r | head -20   # top 20 IPs by access count
10. Count visits by subdomain, based on the Referer field (slightly inaccurate):
cat access.log | awk '{print $11}' | sed -e 's/http:\/\///' -e 's/\/.*//' | sort | uniq -c | sort -rn | head -20
11. List the files with the largest transfer sizes:
cat www.111cn.net | awk '($7~/\.php/){print $10 " " $1 " " $4 " " $7}' | sort -nr | head -100
12. List pages whose output exceeds 200000 bytes (about 200KB), and how often each occurs:
cat www.111cn.net | awk '($10 > 200000 && $7~/\.php/){print $7}' | sort -n | uniq -c | sort -nr | head -100
13. If the last log column records the page transfer time, list the pages that take longest to deliver to the client:
cat www.111cn.net | awk '($7~/\.php/){print $NF " " $1 " " $4 " " $7}' | sort -nr | head -100
14. List the most time-consuming pages (over 60 seconds) and how often each occurs:
cat www.111cn.net | awk '($NF > 60 && $7~/\.php/){print $7}' | sort -n | uniq -c | sort -nr | head -100
15. List files whose transfer time exceeds 30 seconds:
cat www.111cn.net | awk '($NF > 30){print $7}' | sort -n | uniq -c | sort -nr | head -20
16. List how many of each process is running on the current server, in descending order:
ps -ef | awk -F ' ' '{print $8 " " $9}' | sort | uniq -c | sort -nr | head -20
17. View the current number of concurrent Apache connections:
# Compare this with the MaxClients value in httpd.conf to see how far apart they are.
netstat -an | grep ESTABLISHED | wc -l
18. The following commands show the related numbers:
ps -ef | grep httpd | wc -l
#1388
# Counts httpd processes; with Apache's prefork model, each request is handled by one process.
# This means Apache can handle 1388 concurrent requests; Apache adjusts this value automatically based on load.
netstat -nat | grep -i "80" | wc -l
#4341
# netstat -an prints the system's current network connection state; grep -i "80" extracts the connections related to port 80, and wc -l counts them.
# The resulting number is the current total of requests on port 80.
netstat -na | grep ESTABLISHED | wc -l
#376
# netstat -an prints the current network connection state; grep ESTABLISHED extracts the connections that are already established; wc -l counts them.
# The resulting number is the current total of established connections on port 80.
netstat -nat | grep ESTABLISHED
# Shows the detailed record of every established connection.
19. Output the number of connections per IP, plus totals for each connection state:
netstat -n | awk '/^tcp/ {n=split($(NF-1),array,":");if(n<=2)++S[array[(1)]];else++S[array[(4)]];++s[$NF];++N} END {for(a in S){printf("%-20s %s\n", a, S[a]);++I}printf("%-20s %s\n","TOTAL_IP",I);for(a in s) printf("%-20s %s\n",a, s[a]);printf("%-20s %s\n","TOTAL_LINK",N);}'
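The one-liner in item 19 is easier to follow expanded into a multi-line awk program. This sketch is simplified (it assumes IPv4 `addr:port` in the next-to-last field and drops the IPv6 branch) and is fed canned netstat-style lines, so the addresses are hypothetical and no live netstat is needed:

```shell
# Two connections from 1.2.3.4 and one from 9.9.9.9, in three states total
out=$(printf '%s\n' \
  'tcp 0 0 10.0.0.5:80 1.2.3.4:5555 ESTABLISHED' \
  'tcp 0 0 10.0.0.5:80 1.2.3.4:5556 ESTABLISHED' \
  'tcp 0 0 10.0.0.5:80 9.9.9.9:1234 TIME_WAIT' |
awk '/^tcp/ {
    split($(NF-1), array, ":")   # $(NF-1) is the remote addr:port
    ++S[array[1]]                # per-IP connection count
    ++s[$NF]                     # per-state count ($NF is the state)
    ++N                          # total connection count
} END {
    for (a in S) { printf("%-20s %s\n", a, S[a]); ++I }
    printf("%-20s %s\n", "TOTAL_IP", I)
    for (a in s) printf("%-20s %s\n", a, s[a])
    printf("%-20s %s\n", "TOTAL_LINK", N)
}')
echo "$out"
```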
20. Other collected commands
Analyze the top 20 most-visited page URLs on 2012-05-04 and sort them:
cat access.log | grep '04/May/2012' | awk '{print $11}' | sort | uniq -c | sort -nr | head -20
Find the IP addresses whose visited URL contains www.abc.com:
cat access_log | awk '($11~/\www.abc.com/){print $1}' | sort | uniq -c | sort -nr
Get the 10 most active IP addresses; this can also be restricted by time:
cat linewow-access.log | awk '{print $1}' | sort | uniq -c | sort -nr | head -10
# Query the log for a given time range:
cat log_file | egrep '15/Aug/2015|16/Aug/2015' | awk '{print $1}' | sort | uniq -c | sort -nr | head -10
List, in descending order, the IPs that accessed "/index.php?g=Member&m=Public&a=sendValidCode" from 2015/8/15 to 2015/8/16:
cat log_file | egrep '15/Aug/2015|16/Aug/2015' | awk '{if($7 == "/index.php?g=Member&m=Public&a=sendValidCode") print $1,$7}' | sort | uniq -c | sort -nr
# ($7~/\.php/) outputs lines where $7 contains ".php"; this command lists the 100 most time-consuming PHP pages:
cat log_file | awk '($7~/\.php/){print $NF " " $1 " " $4 " " $7}' | sort -nr | head -100
List the most time-consuming pages (over 60 seconds) and how often each occurs:
cat access.log | awk '($NF > 60 && $7~/\.php/){print $7}' | sort -n | uniq -c | sort -nr | head -100
Compute total site traffic (in GB):
cat access.log | awk '{sum+=$10} END {print sum/1024/1024/1024}'
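The traffic sum can be checked on made-up data. This sketch feeds the same awk program two hypothetical log lines whose $10 (response bytes) is 512MB each, so the total should come out to exactly 1GB:

```shell
# Two invented responses of 536870912 bytes (512MB) each; $10 is the bytes field
gb=$(printf '%s\n' \
  '1.2.3.4 - - [16/Aug/2015:14:00:01 +0800] "GET /a HTTP/1.1" 200 536870912' \
  '5.6.7.8 - - [16/Aug/2015:14:00:02 +0800] "GET /b HTTP/1.1" 200 536870912' |
awk '{sum+=$10} END {print sum/1024/1024/1024}')
echo "$gb"   # 1
```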
List the 404 requests:
awk '($9 ~ /404/)' access.log | awk '{print $9,$7}' | sort
Count HTTP status codes:
cat access.log | awk '{counts[$(9)]+=1}; END {for (code in counts) print code, counts[code]}'
cat access.log | awk '{print $9}' | sort | uniq -c | sort -rn
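A quick sanity check of the status-code count, on hypothetical sample lines where $9 is the status (two 200s, one 404):

```shell
codes=$(printf '%s\n' \
  '1.1.1.1 - - [16/Aug/2015:14:00:01 +0800] "GET /a HTTP/1.1" 200 10' \
  '1.1.1.2 - - [16/Aug/2015:14:00:02 +0800] "GET /b HTTP/1.1" 404 20' \
  '1.1.1.3 - - [16/Aug/2015:14:00:03 +0800] "GET /a HTTP/1.1" 200 10' |
awk '{print $9}' | sort | uniq -c | sort -rn)
echo "$codes"   # 200 appears twice, 404 once
```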
Concurrency per second (requests grouped by timestamp):
awk '{if($9~/200|30|404/)COUNT[$4]++}END{for (a in COUNT) print a,COUNT[a]}' log_file | sort -k 2 -nr | head -n10
Bandwidth statistics:
cat apache.log | awk '{if($6~/GET/) count++}END{print "client_request=" count}'
cat apache.log | awk '{byte+=$10}END{print "client_kbyte_out=" byte/1024 "KB"}'
Find the 10 IPs with the most visits on a given day:
cat /tmp/access.log | grep "20/Mar/2011" | awk '{print $1}' | sort | uniq -c | sort -nr | head
See what the day's most active IP is actually requesting:
cat access.log | grep "10.0.21.17" | awk '{print $7}' | sort | uniq -c | sort -nr | head -n 10
The 10 hourly periods with the most IP connections:
awk -vFS="[:]" '{gsub("-.*","",$1);num[$2" "$1]++}END{for(i in num)print i,num[i]}' log_file | sort -n -k 3 -r | head -10
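The hourly-bucket awk above is dense: FS="[:]" splits the line at colons so $2 is the hour, and gsub strips everything after the first "-" so $1 is just the IP. A sketch on invented lines (two hits in hour 14 from one IP, one hit in hour 15 from another) makes the bucketing visible:

```shell
hours=$(printf '%s\n' \
  '1.2.3.4 - - [16/Aug/2015:14:00:01 +0800] "GET /a HTTP/1.1" 200 10' \
  '1.2.3.4 - - [16/Aug/2015:14:05:02 +0800] "GET /b HTTP/1.1" 200 10' \
  '5.6.7.8 - - [16/Aug/2015:15:00:03 +0800] "GET /a HTTP/1.1" 200 10' |
awk -vFS="[:]" '{gsub("-.*","",$1);num[$2" "$1]++}END{for(i in num)print i,num[i]}' |
sort -n -k 3 -r | head -10)
echo "$hours"   # busiest bucket first: hour 14, IP 1.2.3.4, 2 hits
```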
Find the minutes with the most visits:
awk '{print $4}' access.log | grep "20/Mar/2011" | cut -c 14-18 | sort | uniq -c | sort -nr | head
Extract a 5-minute slice of the log:
if [ $DATE_MINUTE != $DATE_END_MINUTE ]; then   # check whether the start and end timestamps differ
START_LINE=`sed -n "/$DATE_MINUTE/=" $APACHE_LOG | head -n1`   # if they differ, take the line number of the start timestamp (and likewise of the end timestamp)
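The snippet above is only a fragment; here is a hedged sketch of the full technique, using `sed -n "/pattern/="` to find the first line numbers matching the start and end minute, then printing that line range. The file name, minute stamps, and log lines are all hypothetical sample data:

```shell
APACHE_LOG=/tmp/slice_demo.log
printf '%s\n' \
  '1.2.3.4 - - [16/Aug/2015:13:59:59 +0800] "GET /a HTTP/1.1" 200 10' \
  '1.2.3.4 - - [16/Aug/2015:14:00:01 +0800] "GET /b HTTP/1.1" 200 10' \
  '1.2.3.4 - - [16/Aug/2015:14:02:07 +0800] "GET /c HTTP/1.1" 200 10' \
  '1.2.3.4 - - [16/Aug/2015:14:05:00 +0800] "GET /d HTTP/1.1" 200 10' > "$APACHE_LOG"
DATE_MINUTE='14:00'       # start minute (hh:mm)
DATE_END_MINUTE='14:05'   # end minute, 5 minutes later
if [ "$DATE_MINUTE" != "$DATE_END_MINUTE" ]; then
  # "=" makes sed print line numbers of matches instead of the lines themselves
  START_LINE=$(sed -n "/$DATE_MINUTE/=" "$APACHE_LOG" | head -n1)
  END_LINE=$(sed -n "/$DATE_END_MINUTE/=" "$APACHE_LOG" | head -n1)
  slice=$(sed -n "${START_LINE},${END_LINE}p" "$APACHE_LOG")   # print that range
fi
echo "$slice"   # the three lines from 14:00 through 14:05
```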
View TCP connection states:
netstat -nat | awk '{print $6}' | sort | uniq -c | sort -rn
netstat -n | awk '/^tcp/ {++S[$NF]}; END {for (a in S) print a, S[a]}'
netstat -n | awk '/^tcp/ {++state[$NF]}; END {for (key in state) print key,"\t",state[key]}'
netstat -n | awk '/^tcp/ {++arr[$NF]}; END {for (k in arr) print k,"\t",arr[k]}'
netstat -n | awk '/^tcp/ {print $NF}' | sort | uniq -c | sort -rn
netstat -ant | awk '{print $NF}' | grep -v '[a-z]' | sort | uniq -c
netstat -ant | awk '/ip:80/{split($5,ip,":");++S[ip[1]]}END{for (a in S) print S[a],a}' | sort -n
netstat -ant | awk '/:80/{split($5,ip,":");++S[ip[1]]}END{for (a in S) print S[a],a}' | sort -rn | head -n 10
awk 'BEGIN{printf ("http_code\tcount_num\n")}{COUNT[$10]++}END{for (a in COUNT) printf a"\t\t"COUNT[a]"\n"}'
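The state-counting variants above all reduce to the same awk idiom. A sketch fed canned netstat-style lines (hypothetical addresses, no live netstat required) shows the per-state tally:

```shell
# One ESTABLISHED and two TIME_WAIT connections; $NF is the state column
states=$(printf '%s\n' \
  'tcp 0 0 10.0.0.5:80 1.2.3.4:5555 ESTABLISHED' \
  'tcp 0 0 10.0.0.5:80 1.2.3.4:5556 TIME_WAIT' \
  'tcp 0 0 10.0.0.5:80 9.9.9.9:1234 TIME_WAIT' |
awk '/^tcp/ {++S[$NF]} END {for (a in S) print a, S[a]}' | sort)
echo "$states"
```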
# Find the top 20 IPs by request count (often used to locate attack sources):
netstat -anlp | grep 80 | grep tcp | awk '{print $5}' | awk -F: '{print $1}' | sort | uniq -c | sort -nr | head -n20
netstat -ant | awk '/:80/{split($5,ip,":");++A[ip[1]]}END{for (i in A) print A[i],i}' | sort -rn | head -n20
# Use tcpdump to sniff port 80 and see which source is most active:
tcpdump -i eth0 -tnn dst port 80 -c 1000 | awk -F"." '{print $1"."$2"."$3"."$4}' | sort | uniq -c | sort -nr | head -20
Find IPs with many TIME_WAIT connections:
netstat -n | grep TIME_WAIT | awk '{print $5}' | sort | uniq -c | sort -rn | head -n20
Check for excessive SYN connections:
netstat -an | grep SYN | awk '{print $5}' | awk -F: '{print $1}' | sort | uniq -c | sort -nr | more
List the process behind a given port:
netstat -ntlp | grep 80 | awk '{print $7}' | cut -d/ -f1
View the total connection count and the currently established count:
netstat -ant | grep $IP:80 | wc -l
netstat -ant | grep $IP:80 | grep EST | wc -l
View access counts per IP:
netstat -nat | grep ":80" | awk '{print $5}' | awk -F: '{print $1}' | sort | uniq -c | sort -n
Linux commands to analyze the current connection states:
netstat -n | awk '/^tcp/ {++S[$NF]} END {for (a in S) print a, S[a]}'
watch "netstat -n | awk '/^tcp/ {++S[\$NF]} END {for (a in S) print a, S[a]}'"   # watch lets you monitor continuously
LAST_ACK 5        # Closing a TCP connection requires closing each direction separately: each side sends a FIN to close its own direction. After sending the final FIN, that side is in LAST_ACK; once it receives the peer's ACK of that FIN, the whole TCP connection is truly closed.
SYN_RECV 30       # Number of requests waiting to be processed.
ESTABLISHED 1597  # Normal data-transfer state.
FIN_WAIT1 51      # The server side actively requested closing the TCP connection.
FIN_WAIT2 504     # The client interrupted the connection.
TIME_WAIT 1057    # Requests already processed, waiting for the timeout to expire.