awk Learning Notes
I recently added a few features that write to a log, but since the log is just raw output, I often can't spot problems by reading it directly. So I want to write some simple monitoring scripts to get an overview of the log: for example, whether there are any errors, and how many errors are reported each day. I remembered an ops colleague sharing awk a while back, so I decided to learn the basics.
Getting Started
The simplest usage is printing selected columns: `$4`, for example, refers to the fourth field, while `$0` is the entire line.
```
[root]/root/test$ ps -ef | grep uwsgi | awk '{print $1,$5}'
root Jul24
root Jul24
root Jul24
root Jul24
root Jul24
root Jul24
root Jul24
root Jul24
root Jul24
root 18:49
```
Formatted output:
```
[root]/root/test$ ps -ef | grep uwsgi | awk '{printf "%-2s and %-4s \n",$1,$5}'
root and Jul24
root and Jul24
root and Jul24
root and Jul24
root and Jul24
root and Jul24
root and Jul24
root and Jul24
root and Jul24
root and 18:51
```
Filtering and comparisons
Comparison operators: `!=`, `<`, `>`, `>=`, `<=`, `==`
```
[root]/root/test$ ps -ef | grep uwsgi | awk '$2=="20596"'
root 20596 20560 0 Jul24 ?        00:00:19 uwsgi -x uwsgi.xml

# Use NR to also keep the header (that is, the first line)
[root]/root/test$ ps -ef | grep uwsgi | awk '$2=="20596" || NR==1 {print $7}'
00:00:01
00:00:19
```
Built-in variables:
`$0` -> the current record (this variable holds the entire line)
`$1`~`$n` -> the nth field of the current record; fields are separated by FS
FS -> input field separator; defaults to space or tab
NF -> number of fields in the current record, i.e. how many columns
NR -> number of records read so far, i.e. the current line number, starting at 1; with multiple input files this value keeps accumulating across files
FNR -> current record number; unlike NR, this value resets to each file's own line number
RS -> input record separator; defaults to newline
OFS -> output field separator; also defaults to a space
ORS -> output record separator; defaults to newline
FILENAME -> name of the current input file
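To see a few of these variables in action, here is a tiny sketch on made-up input (not from the original post):

```shell
# Print each line's number (NR), field count (NF), and the whole record ($0)
printf 'a b c\nd e f\n' | awk '{print "line " NR " has " NF " fields: " $0}'
# line 1 has 3 fields: a b c
# line 2 has 3 fields: d e f
```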
Extract a specific column and display the line number:
```
[root]/root/test$ ps -ef | grep uwsgi | awk '$2=="20596" || NR==1 {printf "No %s, %s \n",NR,$7}'
No 1, 00:00:01
No 6, 00:00:19
```
Specify the separator:
```
[root]/root/test$ awk 'BEGIN{FS=":"} {print $1,$3,$6}' /etc/passwd
root 0 /root
bin 1 /bin
daemon 2 /sbin
adm 3 /var/adm
```
This can also be written as `awk -F: '{print $1,$3,$6}' /etc/passwd`.
To use multiple separators, pass a character class: `awk -F '[;:]'`
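For example, a quick sketch with a made-up line mixing both separators:

```shell
# Fields split on either ';' or ':' (made-up sample line)
echo 'root:x;0;0:root:/root' | awk -F'[;:]' '{print $1, $NF}'
# root /root
```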
Using regular expressions
```
$ cat test.log
2014-07-21 20:00:53,379 - charge - INFO - 30748 - contract_no=CHUANGFU-MIDS-1306
2014-07-21 20:00:53,406 - charge - INFO - 30748 - contract_no=CHUANGFU-MIDS-1306
2014-07-21 20:00:53,431 - charge - INFO - 30748 - contract_no=CHUANGFU-MIDS-1306
2014-07-21 20:00:53,543 - charge - INFO - 30748 - contract_no=VVVGAME-CCDL-1307
2014-07-24 16:00:34,356 - charge - INFO - 18338 - contract_no=SENNHEISER-CC-1405
2014-07-24 16:00:34,394 - charge - INFO - 18338 - contract_no=SENNHEISER-CC-1405
2014-07-24 16:04:24,431 - charge - INFO - 19081 - contract_no=SENNHEISER-CC-1405
2014-07-24 16:04:24,479 - charge - INFO - 19081 - contract_no=SENNHEISER-CC-1405
2014-07-24 16:07:20,349 - charge - INFO - 19081 - contract_no=SENNHEISER-CC-1405
2014-07-24 16:07:20,390 - charge - INFO - 19081 - contract_no=SENNHEISER-CC-1405
[root]/application/2.0/nirvana/logs$ awk '$10 ~ /MIDS/ {print NR,$1,$2}' test.log
1 2014-07-21 20:00:53,379
2 2014-07-21 20:00:53,406
3 2014-07-21 20:00:53,431
```
Here `~` means the field matches the pattern; use `!~` to negate the match. The `/.../` delimiters enclose a regular expression.
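A small illustration of `!~` on made-up lines:

```shell
# !~ keeps only lines whose first field does NOT match /CC/ (made-up data)
printf 'a=MIDS\nb=CCDL\nc=CC\n' | awk '$1 !~ /CC/ {print}'
# a=MIDS
```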
To save results to a file, just use shell redirection.
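For example (a hypothetical one-liner; `out.txt` is an arbitrary name):

```shell
# Write the first column to a file with plain shell redirection
printf 'a 1\nb 2\n' | awk '{print $1}' > out.txt
cat out.txt
# a
# b
```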
Splitting a log into several files using if/else and redirection inside awk:
```
$ awk '{if($10 ~ /MIDS/) print > "mids.txt"; else if($10 ~ /CCDL/) print > "ccdl.txt"; else print > "cc.txt"}' test.log
```
A couple of small demos
```
# Compute the total size of the .log files
$ ls -l *.log | awk '{sum+=$5} END {print sum}'
102610686

# Print the 9x9 multiplication table
$ seq 9 | sed 'H;g' | awk -v RS='' '{for(i=1;i<=NF;i++)printf("%dx%d=%d%s", i, NR, i*NR, i==NR?"\n":"\t")}'
1x1=1
1x2=2	2x2=4
1x3=3	2x3=6	3x3=9
1x4=4	2x4=8	3x4=12	4x4=16
1x5=5	2x5=10	3x5=15	4x5=20	5x5=25
1x6=6	2x6=12	3x6=18	4x6=24	5x6=30	6x6=36
1x7=7	2x7=14	3x7=21	4x7=28	5x7=35	6x7=42	7x7=49
1x8=8	2x8=16	3x8=24	4x8=32	5x8=40	6x8=48	7x8=56	8x8=64
1x9=9	2x9=18	3x9=27	4x9=36	5x9=45	6x9=54	7x9=63	8x9=72	9x9=81
```
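Coming back to the original motivation, counting how many errors the log reports per day: an awk associative array handles the grouping. A minimal sketch, assuming lines start with a date and errors contain the word ERROR (both are assumptions, since the real log format isn't shown here):

```shell
# Count lines containing ERROR, grouped by the date in field 1;
# sort at the end because awk's for-in iteration order is unspecified
printf '2014-07-21 ERROR boom\n2014-07-21 INFO ok\n2014-07-24 ERROR bad\n' |
  awk '/ERROR/ {count[$1]++} END {for (d in count) print d, count[d]}' | sort
# 2014-07-21 1
# 2014-07-24 1
```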
How to write full awk scripts is something I'll come back to in a later post.
Reference
awk Concise Tutorial
This article comes from the "Orangleliu Notebook" blog; please keep this source when sharing: http://blog.csdn.net/orangleliu/article/details/38357071