A simple implementation of concurrent log printing and writing in Python

Source: Internet
Author: User
Tags: flock, posix

We generally use the logging module for log printing. logging is thread-safe, and there are plenty of write-ups on making it work across multiple processes as well: introduce a file lock and configure logging appropriately, and it can be made to cope.

However, testing quickly shows that with multiple processes it is still prone to writing duplicate records or failing to write the log file properly.

My logging requirements are fairly simple: be able to split output across different files, and write each log file correctly.
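The fix used here is to hold an exclusive file lock around each write. As a minimal POSIX-only sketch of that pattern (the full listing below wraps the same idea behind cross-platform lock/unlock helpers; the path and function name here are only for illustration):

import fcntl

def locked_append(path, line):
    # open in append mode, take an exclusive lock, write and flush, then release
    with open(path, 'a+') as f:
        fcntl.flock(f.fileno(), fcntl.LOCK_EX)
        try:
            f.write(line + '\n')
            f.flush()
        finally:
            fcntl.flock(f.fileno(), fcntl.LOCK_UN)

# locked_append('log/demo.log', 'hello from one of many processes')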

A file lock is introduced, the log-writing function is encapsulated in an operation class _logger, and the logger name and write level are encapsulated in a business class Logger.
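Business code then only touches the Logger class. A minimal usage sketch, with names matching the classes defined in the listing below (the logger and file names are made up for illustration):

# one Logger per worker; messages with the same file_name land in the same log file
logger = Logger(logger_name='worker-1', file_name='worker.log')
logger.info('started')           # only printed, not written to file
logger.error('something broke')  # printed and appended to log/worker.log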

This example is implemented in Python 3. In the test, 20 processes write concurrently to 3 files; with each file holding more than 100 rows of data, the log files show no duplicated data and no missing data. (A small verification sketch is given after the listing.)

# -*- coding: utf-8 -*-
"""
Author: yinshunyao
DATE: 2017/3/5 0005 10:50
"""
# import logging
import os
import time

# use the platform's locking primitives for file locking and unlocking
if os.name == 'nt':
    import win32con, win32file, pywintypes
    LOCK_EX = win32con.LOCKFILE_EXCLUSIVE_LOCK
    LOCK_SH = 0  # the default value
    LOCK_NB = win32con.LOCKFILE_FAIL_IMMEDIATELY
    __overlapped = pywintypes.OVERLAPPED()

    def lock(file, flags):
        hfile = win32file._get_osfhandle(file.fileno())
        win32file.LockFileEx(hfile, flags, 0, 0xffff0000, __overlapped)

    def unlock(file):
        hfile = win32file._get_osfhandle(file.fileno())
        win32file.UnlockFileEx(hfile, 0, 0xffff0000, __overlapped)

elif os.name == 'posix':
    import fcntl
    from fcntl import LOCK_EX, LOCK_SH, LOCK_NB

    def lock(file, flags):
        fcntl.flock(file.fileno(), flags)

    def unlock(file):
        fcntl.flock(file.fileno(), fcntl.LOCK_UN)
else:
    raise RuntimeError("File Locker only support NT and Posix platforms!")



class _logger:
    file_path = ''

    # initialize the log path
    @staticmethod
    def init():
        if not _logger.file_path:
            _logger.file_path = '%s/log' % os.path.abspath(os.path.dirname(__file__))
            # make sure the log directory exists, otherwise open() below can never succeed
            os.makedirs(_logger.file_path, exist_ok=True)
        return True

    @staticmethod
    def _write(message, file_name):
        if not message:
            return True
        message = message.replace('\t', ',')
        file = '{}/{}'.format(_logger.file_path, file_name)
        # keep retrying until the file is opened and the exclusive lock is acquired
        while True:
            try:
                f = open(file, 'a+')
                lock(f, LOCK_EX)
                break
            except:
                time.sleep(0.01)
                continue

        # ensure the buffered content is written to the file
        while True:
            try:
                f.write(message + '\n')
                f.flush()
                break
            except:
                time.sleep(0.01)
                continue

        # release the lock and close the file
        while True:
            try:
                unlock(f)
                f.close()
                return True
            except:
                time.sleep(0.01)
                continue

    @staticmethod
    def write(message, file_name, only_print=False):
        if not _logger.init():
            return
        print(message)
        if not only_print:
            _logger._write(message, file_name)



class Logger:
    def __init__(self, logger_name, file_name=''):
        self.logger_name = logger_name
        self.file_name = file_name

    # build the log line from the message and level, using a custom format
    def _build_message(self, message, level):
        try:
            return '[%s]\t[%5s]\t[%8s]\t%s' \
                   % (time.strftime('%Y-%m-%d %H:%M:%S'), level, self.logger_name, message)
        except Exception as e:
            print('Parse log message exception: {}'.format(e))
            return ''

    def warning(self, message):
        _logger.write(self._build_message(message, 'WARN'), self.file_name)

    def warn(self, message):
        _logger.write(self._build_message(message, 'WARN'), self.file_name)

    def error(self, message):
        _logger.write(self._build_message(message, 'ERROR'), self.file_name)

    def info(self, message):
        # info messages are only printed, not written to the log file
        _logger.write(self._build_message(message, 'INFO'), self.file_name, True)

    def debug(self, message):
        _logger.write(self._build_message(message, 'DEBUG'), self.file_name)

# cyclic log-printing test function; each call runs until the process is killed
def _print_test(count):
    logger = Logger(logger_name='test{}'.format(count), file_name='test{}'.format(count % 3))
    key = 0
    while True:
        key += 1
        # print('{}-{}'.format(logger, key))
        logger.debug('%d' % key)
        logger.error('%d' % key)

if __name__ == '__main__':
    from multiprocessing import Pool, freeze_support
    freeze_support()
    # use a process pool of 20 workers to exercise concurrent writes
    pool = Pool(processes=20)
    count = 0
    while count < 20:
        count += 1
        pool.apply_async(func=_print_test, args=(count,))
    else:
        pool.close()
        pool.join()
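A rough way to check the "no duplicated, no missing rows" claim from the test above is to scan each generated file and verify that, for every logger and level, the counters it wrote are consecutive. The sketch below assumes the files only contain lines in the exact format produced by _build_message and _write above (comma-separated fields, level second, logger name third, counter last); a line truncated by killing the test is reported as a failure:

import os
from collections import defaultdict

def check_log_file(path):
    last_seen = defaultdict(int)  # (logger name, level) -> last counter seen
    with open(path) as f:
        for line in f:
            parts = line.strip().split(',')
            if len(parts) < 4:
                return False  # interleaved or garbled line
            level = parts[1].strip(' []')
            name = parts[2].strip(' []')
            if not parts[-1].strip().isdigit():
                return False  # truncated line
            key = int(parts[-1])
            if key != last_seen[(name, level)] + 1:
                return False  # a gap or a duplicate for this logger/level
            last_seen[(name, level)] = key
    return True

# for name in os.listdir('log'):
#     print(name, check_log_file(os.path.join('log', name)))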
