Using shell commands to collect and analyze statistics from logs
When you need to collect log information and analyze user behavior, shell commands make it easy to extract large amounts of data, which you can then paste into Excel for further statistics.
For example, to take every request in the statistics log whose address contains loadCustomProcess and sort them by time consumed:
grep "loadCustomProcess" /home/workflow/socket.txt | awk -F " " '{print $11}' | awk -F ":" '{print $2}' | sort -nr
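The actual format of socket.txt is not shown, so as an illustration assume hypothetical lines whose 11th space-separated field looks like `cost:120` (the token name and positions are assumptions); the pipeline then behaves like this sketch:

```shell
# Three hypothetical log lines; "cost:NNN" in field 11 stands in
# for whatever time-consumption token socket.txt really carries.
printf '%s\n' \
  'a b c d e f g h i loadCustomProcess cost:120' \
  'a b c d e f g h i loadCustomProcess cost:45' \
  'a b c d e f g h i loadCustomProcess cost:300' |
grep "loadCustomProcess" |
awk -F " " '{print $11}' |  # keep only field 11: cost:NNN
awk -F ":" '{print $2}' |   # keep only the number after the colon
sort -nr                    # numeric sort, descending
# prints 300, 120, 45 (one value per line)
```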
Another example: among the lines containing "INFO", count how often each value of the eighth field appears and display the 10 most frequent:
grep "INFO" /usr/share/tomcat6/logs/flowplatform.log | awk -F " " '{print $8}' | sort | uniq -c | sort -nr | head -10
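The real format of flowplatform.log is likewise unknown; assuming field 8 holds, say, a request path (an assumption for illustration only), the grouping works like this sketch:

```shell
# Hypothetical lines: field 8 stands in for whatever token
# (URL, class name, ...) the real log carries at that position.
printf '%s\n' \
  '1 2 3 4 5 6 INFO /listFlow x' \
  '1 2 3 4 5 6 INFO /loadCustomProcess x' \
  '1 2 3 4 5 6 INFO /listFlow x' |
grep "INFO" |
awk -F " " '{print $8}' |  # pick the grouping field
sort | uniq -c |           # the GROUP BY + COUNT step
sort -nr | head -10        # most frequent first, top 10
```

Each output line is a count followed by the distinct field value, most frequent first.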
Explanation:
grep "loadCustomProcess" /home/workflow/socket.txt: finds the lines containing loadCustomProcess in the file /home/workflow/socket.txt.
uniq -c: counts consecutive duplicate lines (note: only consecutive ones), so you must run sort before uniq -c to get the COUNT effect of SQL.
awk plays the role of SQL's SELECT: -F " " tells it to split each line on spaces, each resulting unit is a field, and, for example, $7 refers to the seventh field.
sort: orders the results so that the subsequent uniq -c can aggregate them,
sort -nr: sorts numerically in descending order, equivalent to ORDER BY ... DESC in SQL,
head -10: takes the top 10 lines.
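The GROUP BY point above is easy to verify on toy input: uniq -c alone only counts adjacent duplicates, while sort | uniq -c counts all of them:

```shell
# Without sort: the two "a" lines are not adjacent,
# so uniq -c reports each of them separately (three output lines).
printf 'a\nb\na\n' | uniq -c
# With sort: duplicates become adjacent first, giving a true per-value
# count (two output lines), i.e. SQL's GROUP BY value + COUNT(*).
printf 'a\nb\na\n' | sort | uniq -c
```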
As for a shell script for January's log statistics: scripts like that have to be written for your specific environment, by yourself. Take a quick look at shell and you will find such requirements simple to write; it is not hard at all.
How to use shell commands (not shell scripts) to count the log lines whose message is "error" and sort them in ascending order of that count?
Without a concrete log to work from, only a general approach can be given:
1. Filter with grep, e.g. grep "error" | other commands
2. Chain further commands with the pipe "|" above; use awk to tally the different errors (recording them into an array, for example) and sort the output.
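As a concrete sketch of that approach (the log layout, the /tmp/demo_error.log path, and the choice of field 3 as the error message are all assumptions for illustration):

```shell
# Build a small hypothetical log to demonstrate on.
cat > /tmp/demo_error.log <<'EOF'
2024-01-01 error timeout
2024-01-01 info started
2024-01-02 error timeout
2024-01-03 error refused
EOF
# Filter the "error" lines, group on the assumed message field ($3),
# count each group, then sort the counts in ascending order (sort -n).
grep "error" /tmp/demo_error.log | awk '{print $3}' | sort | uniq -c | sort -n
```

Here plain `sort -n` (no `-r`) gives the ascending order the question asks for, so the rarest error comes first.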