The script below backs up Tomcat's catalina.out, the localhost_access_log.yyyy-mm-dd.txt access logs, and the project's own log files, whose names follow the pattern "projectname-yyyy-mm-dd.log". The backup script follows, with comments explaining each step; adapt it to your own log-file formats and requirements.
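Since the whole script hinges on the date embedded in each file name, here is a minimal sketch of the extraction technique it relies on: split the name on dots, take the next-to-last field, then keep the last ten characters. The file name "myapp-2023-01-15.log" is a made-up example.

```shell
#!/bin/bash
# Derive the yyyy-mm-dd date from a log file name (example name, not a real file)
n="myapp-2023-01-15.log"
m=$(echo "$n" | awk -F. '{print $(NF-1)}')   # drop the trailing ".log" field -> myapp-2023-01-15
m=${m:0-10}                                  # keep the last 10 characters -> 2023-01-15
echo "$m"
```

The same expression also works for names like catalina.out.2023-01-15.log and localhost_access_log.2023-01-15.txt, because in every case the next-to-last dot-separated field ends with the ten-character date.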
#!/bin/bash
######
# Scheduled to run daily at 00:01
# Deletes log files older than 20 days and compresses log files older than one week
# A log file's age is taken from the date embedded in its name
# Before running, make sure no unrelated files in the log directory share the
# .log/.txt suffixes, and that all log file names follow the expected format
######

# Directory containing the log files
path=/home/tomcat/apache-tomcat-project/logs
# Enter the log directory
cd "$path" || exit 1

# --- Back up catalina.out ---
# Get yesterday's date
bak_date=$(date +%Y-%m-%d -d '1 days ago')
# Copy catalina.out with the date appended
cp catalina.out "catalina.out.$bak_date.log"
# Truncate catalina.out
echo > catalina.out

# --- Delete log files older than 20 days ---
# Get the date 20 days ago
del_date=$(date +%Y-%m-%d -d '20 days ago')
# Extract the date string from each file name and compare; the
# localhost_access_log files usually end in .txt, so *.txt is included
for n in $(ls -1 *.log *.txt); do
  m=$(echo "$n" | awk -F. '{print $(NF-1)}')
  m=${m:0-10}            # last 10 characters: the yyyy-mm-dd date
  if [[ $m < $del_date || $m = $del_date ]]; then
    echo "file $n will be deleted."
    rm -f "$n"
  fi
done

# --- Compress files older than one week ---
# Get the date one week ago
zip_date=$(date +%Y-%m-%d -d '7 days ago')
# Extract the date string from each file name and compare
for n in $(ls -1 *.log *.txt); do
  m=$(echo "$n" | awk -F. '{print $(NF-1)}')
  m=${m:0-10}
  echo "$n $m"
  if [ ! "$m" ]; then
    echo "date is empty, skipping"
    continue
  fi
  if [[ $m < $zip_date || $m = $zip_date ]]; then
    echo "file $n will be zipped."
    zip "$n.zip" "$n"
    rm -f "$n"
  fi
done
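The script's header says it should run daily at 00:01. One way to schedule it, assuming it is saved as /home/tomcat/clean_logs.sh (a hypothetical path) and made executable, is a crontab entry along these lines:

```shell
# minute hour day-of-month month day-of-week  command
# Run the log cleanup script every day at 00:01; paths are examples only
1 0 * * * /home/tomcat/clean_logs.sh >> /home/tomcat/clean_logs.cron.log 2>&1
```

Redirecting stdout and stderr to a file keeps a record of which files were deleted or zipped on each run, which is useful when verifying the script against your own log-name formats.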
That is the whole script for periodically cleaning up Tomcat log files. I hope it helps; if you have any questions, leave a message and I will reply promptly. Many thanks to everyone for supporting the cloud Habitat Community website!