Because I had never set up Nginx log rotation, reading the logs often meant facing files of 10 GB or even hundreds of GB. So I wrote a Python script to split the Nginx logs (of course this can also be done very simply with a shell script).
Requirements:
1. Split all Nginx logs by day.
2. Since the logs do not need to be readable at any moment, they should be archived (compressed as .tar.gz).
3. Archived logs need an expiration time and a capacity limit: split logs are deleted periodically once they are older than a certain number of days, or once the backup directory exceeds a certain size.
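Requirement 3 boils down to two checks: how old is the oldest backup, and how large is the backup directory. A minimal sketch of that logic (the function and parameter names here are my own, not from the script below):

```python
import os
import time


def dir_size_mb(path):
    """Total size of all files under path, in MiB."""
    total = 0
    for root, _, files in os.walk(path):
        total += sum(os.path.getsize(os.path.join(root, f)) for f in files)
    return total / 1024.0 / 1024.0


def age_in_days(path):
    """Whole days elapsed since path was last modified."""
    return int((time.time() - os.path.getmtime(path)) // 86400)


def should_prune(backup_dir, max_days, max_gb):
    """True when the oldest backup is too old or the directory is too large."""
    files = [os.path.join(r, f)
             for r, _, fs in os.walk(backup_dir) for f in fs]
    if not files:
        return False
    oldest = min(files, key=os.path.getmtime)
    # Compare the size limit in the same unit (MiB): 1 GB limit = 1024 MiB.
    return age_in_days(oldest) > max_days or dir_size_mb(backup_dir) > max_gb * 1024
```

The pruning loop then simply deletes the oldest file and re-checks until `should_prune` returns False, which is exactly what the full script does.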
Analysis:
Based on the requirements, we need to: back up the previous day's logs ----> make Nginx reopen its log files ----> compress the backed-up logs ----> check the backup directory for expired backups.
The script below follows this analysis step by step. If you spot any problems, feel free to point them out.
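The "reopen" step is worth a note: after the old files have been moved away, `nginx -s reopen` sends SIGUSR1 to the nginx master process, which recreates access.log/error.log and starts writing to the fresh files. The full script invokes this through `os.system`; as a hedged sketch, `subprocess.run` is a slightly safer equivalent (the binary path here is an assumption, adjust it for your install):

```python
import subprocess

# Assumed default install location; change to match your system.
NGINX_BIN = "/usr/local/nginx/sbin/nginx"


def reopen_nginx_logs(nginx_bin=NGINX_BIN):
    """Ask the nginx master process to reopen its log files.

    Equivalent to `nginx -s reopen` on the command line; returns True
    when the command exited successfully.
    """
    result = subprocess.run([nginx_bin, "-s", "reopen"])
    return result.returncode == 0
```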
# coding: utf-8
"""
VERSION: 1.0.0
AUTHOR:  youshumin
DATE:    2018/04/28

The script needs:
1. The Nginx log path (directory)
2. The path of the nginx executable
3. The backup path for the logs
4. How long / how much space to keep backups (days and GB)
"""
import os
import time
import shutil
import tarfile
import datetime

#######
NGINX_LOG_DIR = "/data/nginx/web1/"
NGINX_SBIN = "/usr/local/nginx/sbin/nginx"
NGINX_BAK_DIR = "/data/nginx/bak/web1"
NGINX_BAK_DAY = 15           # keep backups for at most 15 days
NGINX_BAK_MAX_SIZE = 20      # backup directory may grow to at most 20 GB
#######

TIME_SECS = time.strftime("%Y%m%d%H%M%S", time.localtime())
TIME_DAY = time.strftime("%Y%m%d", time.localtime())
POSSIBLE_TOPDIR = os.path.normpath(os.path.abspath(NGINX_LOG_DIR))
TMP_WORK_DIR = os.path.normpath(os.path.join(POSSIBLE_TOPDIR, TIME_SECS))


def mv_log(sour_dir, desc_dir):
    """Move all current log files into a timestamped working directory."""
    need_mv_file_list = os.listdir(sour_dir)
    os.mkdir(desc_dir)
    for item in need_mv_file_list:
        shutil.move(os.path.join(sour_dir, item), desc_dir)


def reload_nginx_log(nginx_sbin):
    """Tell the nginx master process to reopen its log files."""
    shell_command = "{0} -s reopen".format(nginx_sbin)
    if os.system(shell_command) == 0:
        print("Nginx log has been reopened")


def tar_log_file(log_path, tar_dir):
    """Compress the moved logs into <tar_dir>/web1_<YYYYMMDD>.tar.gz,
    then remove the uncompressed originals."""
    tar_bak_name = os.path.normpath(
        os.path.join(tar_dir, "web1_{0}.tar.gz".format(TIME_DAY)))
    tar = tarfile.open(tar_bak_name, "w:gz")
    tar.add(log_path, arcname=os.path.basename(log_path))
    tar.close()
    shutil.rmtree(log_path)


def del_one_old_file(del_file_dir, check_day=None):
    """With check_day=True, return the age in days of the oldest backup;
    otherwise delete the oldest backup file."""
    for root, dirs, files in os.walk(del_file_dir):
        files.sort(key=lambda fn: os.path.getctime(os.path.join(root, fn)))
        if check_day:
            old_file_time_day = datetime.datetime.fromtimestamp(
                os.path.getctime(os.path.join(root, files[0]))).strftime("%Y%m%d")
            time_now = time.strftime("%Y%m%d", time.localtime())
            return int(time_now) - int(old_file_time_day)
        else:
            os.remove(os.path.normpath(os.path.join(root, files[0])))


def check_true_or_false(nginx_bak_dir, bak_days, bak_size):
    """Return True when the oldest backup is older than bak_days or the
    backup directory is larger than bak_size (given in GB)."""
    nginx_bak_dir = os.path.normpath(nginx_bak_dir)
    size = 0
    for root, dirs, files in os.walk(nginx_bak_dir):
        size += sum(os.path.getsize(os.path.join(root, name)) for name in files)
    mb_size = float(size) / 1024.0 / 1024.0
    mb_max_bak_size = float(bak_size) * 1024.0   # GB limit expressed in MB
    return (del_one_old_file(nginx_bak_dir, True) > bak_days
            or mb_size > mb_max_bak_size)


def check_bak_dir(nginx_bak_dir, bak_days, bak_size):
    """Delete the oldest backups until both limits are satisfied."""
    while check_true_or_false(nginx_bak_dir, bak_days, bak_size):
        del_one_old_file(nginx_bak_dir)


if __name__ == "__main__":
    """
    mv_log:            move the current log files out of the way
    reload_nginx_log:  make nginx reopen (recreate) its log files
    tar_log_file:      archive the moved log files
    check_bak_dir:     prune the backup directory by age and size
    """
    mv_log(POSSIBLE_TOPDIR, TMP_WORK_DIR)
    reload_nginx_log(NGINX_SBIN)
    tar_log_file(TMP_WORK_DIR, NGINX_BAK_DIR)
    check_bak_dir(NGINX_BAK_DIR, NGINX_BAK_DAY, NGINX_BAK_MAX_SIZE)
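To actually split the logs daily, the script has to be scheduled; cron is the usual choice. A crontab entry along these lines runs it just before midnight so each archive covers a full day (the interpreter and script paths here are illustrative, adjust them to where you deploy the script):

```shell
# min hour dom mon dow  command   (paths below are examples)
59 23 * * * /usr/bin/python /opt/scripts/nginx_cut_log.py >> /var/log/nginx_cut_log.out 2>&1
```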
Nginx log rotation script (Python)