Because of the project's special requirements, its daily logs must be retained, and the log files are large; over time they eat into the disk space. The older log files therefore need to be compressed and backed up regularly. Here I do the packaging with a shell script and register it as a Linux scheduled task (cron). The script is only a few lines long, but I ran into several problems, described below:
First, I used the tar command directly to package the log files in the log directory, deleting them after packaging. My shell script was as follows:
#!/bin/sh
# gztest project log compression program
cd /var/www/gztest/Temp/log && rm -rf ./*.txt
# Package and compress the log files
logdir=$(date +%Y-%m-%d).log.tar
tar -jcf ${logdir} ./*.log && rm -rf ./*.log
I traced the script's execution with: sh -x /root/logzip.sh
During packaging, tar warned "file changed as we read it". You can probably guess the reason: a program was still writing to the log file while tar was reading it. tar exits with a non-zero status here, so the command after && is never executed.
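As an aside before the copy-based fix below: with GNU tar, "file changed as we read it" yields exit status 1, which still writes the archive, while status 2 and above signal a hard error. A minimal sketch (an assumption about GNU tar's exit codes, not the approach the post ultimately takes) would tolerate status 1 and only delete the logs when an archive was actually produced. For demonstration the sketch works in a scratch directory with sample logs; in practice you would cd to /var/www/gztest/Temp/log instead.

```shell
#!/bin/sh
# Sketch: package logs in place, tolerating GNU tar exit status 1
# ("file changed as we read it" -- the archive is still written).
# Demo setup: a scratch directory with two sample log files.
workdir=$(mktemp -d)
cd "$workdir" || exit 1
echo "sample entry" > app1.log
echo "sample entry" > app2.log

archive=$(date +%Y-%m-%d).log.tar
tar -jcf "$archive" ./*.log
status=$?
if [ "$status" -gt 1 ]; then
    # Status 2+ is a hard error: keep the original log files.
    echo "tar failed with status $status; keeping the log files" >&2
    exit "$status"
fi
# The archive was written (status 0 or 1), so the originals can go.
rm -f ./*.log
```

The trade-off is that on status 1 the archived copy may be slightly stale for the file that was still being written, which is why the post switches to copying first.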
Later I changed the approach: first copy the log files into a temporary directory, then package that directory, and finally delete both the temporary directory and the log files that were packaged. The shell script is as follows:
#!/bin/sh
# gztest project log compression program
cd /var/www/gztest/Temp/log && rm -rf ./*.txt
# Package and compress the log files
logdir=$(date +%Y-%m-%d)
mkdir ${logdir}
find ./ -name '*.log' -mtime +0 -type f -exec cp -r {} ./${logdir} \;
tar -jcf ${logdir}.log.tar ./${logdir}/*
rm -rf ./${logdir}
find ./ -name '*.log' -mtime +0 -type f -exec rm -rf {} \;
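A note on the -mtime +0 test used in the script above: it matches files whose modification time is more than 24 hours in the past, so the current, still-growing log is left alone and only completed logs are copied and deleted. A small sketch (assuming GNU touch for the -d backdating) shows the selection:

```shell
#!/bin/sh
# Demonstrate find's -mtime +0: only files modified more than
# 24 hours ago are matched. Uses a scratch directory.
workdir=$(mktemp -d)
cd "$workdir" || exit 1
echo old > old.log
touch -d '2 days ago' old.log   # backdate (GNU touch assumed)
echo new > new.log              # modified just now

# Only old.log is older than 24 hours, so only it is selected.
matched=$(find ./ -name '*.log' -mtime +0 -type f)
echo "$matched"
```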
The test is successful.
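The post mentions adding the script to a Linux scheduled task but does not show the entry; a possible crontab line (the daily 01:30 run time and the cron log path are assumptions) would look like:

```
# Hypothetical crontab entry, added via `crontab -e`:
# run the log compression script every day at 01:30.
30 1 * * * /bin/sh /root/logzip.sh >> /var/log/logzip.cron.log 2>&1
```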