Background: any long-running program inevitably generates a lot of logs, and the error logs are the ones we care about most. Some systems write errors to a separate file, but ours does not. What makes it worse is that a new log file is created every hour, so checking whether there were any errors on a given day means opening N files, and plain grep is not enough because the entire stack trace has to be captured, not just the matching line.
I'm a beginner at shell scripting and wrote the script below. The main logic: when a line's log level is ERROR (or it contains an exception), start recording, and keep recording until the next line whose level is INFO or DEBUG.
#!/bin/bash
# Print every ERROR/Exception line together with the stack trace that
# follows it, stopping at the next INFO or DEBUG line.
isInErrorBlock=false
while IFS= read -r line
do
    idx=$(expr match "$line" ".*ERROR")
    if [ "$idx" -le 0 ]; then
        idx=$(expr match "$line" ".*Exception")
    fi
    idx2=$(expr match "$line" ".*DEBUG")
    idx3=$(expr match "$line" ".*INFO")
    if [ "$idx" -gt 0 ]; then
        # an ERROR/Exception line opens (or continues) an error block
        isInErrorBlock=true
        echo "$line"
    elif [ "$idx2" -gt 0 ] || [ "$idx3" -gt 0 ]; then
        # the next INFO/DEBUG line closes the block
        isInErrorBlock=false
    elif $isInErrorBlock; then
        # everything else inside a block is part of the stack trace
        echo "$line"
    fi
done
exit 0
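The same state machine can also be expressed with shell builtins alone (`case` and `[`), so no external `expr` process is needed for each line. This is only a sketch of that idea; the function name `find_error_blocks` is mine, and it assumes the same log-level keywords as above.

```shell
# Sketch: same block-extraction logic using only shell builtins.
find_error_blocks() {
    inblock=0
    while IFS= read -r line; do
        case $line in
            *ERROR*|*Exception*)   # opens (or continues) an error block
                inblock=1
                printf '%s\n' "$line" ;;
            *INFO*|*DEBUG*)        # a normal log line closes the block
                inblock=0 ;;
            *)                     # stack-trace lines inside a block
                if [ "$inblock" -eq 1 ]; then
                    printf '%s\n' "$line"
                fi ;;
        esac
    done
}
```

Used the same way as the script: `cat *.log | find_error_blocks`.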
The results basically meet the requirements: for each directory, `cat *.log | sh find_error.sh` writes all the error stacks to a single file. But it runs very slowly, and I don't know how to improve the script's efficiency.
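My guess is that most of the time goes into forking an `expr` process for every comparison on every line; pushing the per-line matching into a single awk process avoids that entirely. A minimal sketch, assuming the same log format (ERROR/Exception opens a block, INFO/DEBUG closes it); the wrapper name `find_error_awk` is my own:

```shell
# Sketch: the same state machine in one awk process, no per-line forks.
find_error_awk() {
    awk '
        /ERROR|Exception/                     { inblock = 1 }   # block opens
        !/ERROR|Exception/ && /INFO|DEBUG/    { inblock = 0 }   # block closes
        inblock                               { print }         # error + stack
    '
}
```

A whole directory can then be processed in one pass: `cat *.log | find_error_awk > errors.txt`.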