A tutorial on using Python's logging module in a multi-process environment

Source: Internet
Author: User

Objective

Every programmer who writes background tasks in Python knows that it is often necessary to write logs to record the program's running state, and to save the details of any errors so they can be debugged and analyzed. Python's logging module is a good helper for this.

The logging module lets you specify a log level: DEBUG, INFO, WARNING, ERROR, or CRITICAL. For example, during development and debugging you can output logs at DEBUG level and above, while in production you output only INFO and above. (If no level is specified, the default is WARNING.)
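The level filtering described above can be seen in a minimal sketch (the logger name "level_demo" and the messages are illustrative):

```python
import io
import logging

# Capture output in a string buffer so the effect of the level is visible.
stream = io.StringIO()
handler = logging.StreamHandler(stream)

logger = logging.getLogger("level_demo")
logger.addHandler(handler)
logger.setLevel(logging.INFO)  # production setting: suppress DEBUG

logger.debug("detailed state")    # below INFO, filtered out
logger.info("service started")    # emitted
logger.warning("disk almost full")  # emitted

print(stream.getvalue())
```

Only the INFO and WARNING messages reach the handler; the DEBUG call is discarded by the logger's level check.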

Logging can also direct output to the command line or to a file, and can split the log file by time or by size.
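Size-based splitting is done with RotatingFileHandler from the standard library. The following sketch uses a deliberately tiny maxBytes and a temporary directory (both illustrative values) so the rotation is easy to observe:

```python
import logging
import logging.handlers
import os
import tempfile

log_dir = tempfile.mkdtemp()
log_path = os.path.join(log_dir, "app.log")

# Rotate after ~1 KB per file, keeping at most 3 old files.
handler = logging.handlers.RotatingFileHandler(
    log_path, maxBytes=1024, backupCount=3)
handler.setFormatter(
    logging.Formatter("[%(asctime)s] %(levelname)s %(message)s"))

logger = logging.getLogger("rotation_demo")
logger.setLevel(logging.INFO)
logger.addHandler(handler)

# Write enough records to force several rollovers.
for i in range(100):
    logger.info("message %d", i)

handler.close()
print(sorted(os.listdir(log_dir)))
# → ['app.log', 'app.log.1', 'app.log.2', 'app.log.3']
```

When the current file would exceed maxBytes, it is renamed to app.log.1 (existing backups shift to .2, .3, ...), and backups beyond backupCount are deleted.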

The detailed usage of logging will not be repeated here; you can refer to the official documentation, or the introduction here.

Configuration of logging

Normally, we want to save logs to a file and have the file split automatically, to avoid the log file growing too large. An example logging configuration is given below.

import logging.config

logging.config.dictConfig({
    'version': 1,
    'disable_existing_loggers': True,
    'formatters': {
        'verbose': {
            'format': '[%(asctime)s] %(levelname)s [%(name)s:%(lineno)s] %(message)s',
            'datefmt': '%Y-%m-%d %H:%M:%S'
        },
        'simple': {
            'format': '%(levelname)s %(message)s'
        },
    },
    'handlers': {
        'null': {
            'level': 'DEBUG',
            'class': 'logging.NullHandler',
        },
        'console': {
            'level': 'DEBUG',
            'class': 'logging.StreamHandler',
            'formatter': 'verbose'
        },
        'file': {
            'level': 'DEBUG',
            'class': 'logging.handlers.RotatingFileHandler',
            # split the log when the file reaches 10 MB
            'maxBytes': 1024 * 1024 * 10,
            # keep at most 50 backup files
            'backupCount': 50,
            # if delay is true, file opening is deferred
            # until the first call to emit()
            'delay': True,
            'filename': 'logs/mysite.log',
            'formatter': 'verbose'
        }
    },
    'loggers': {
        '': {
            'handlers': ['file'],
            'level': 'INFO',
        },
    }
})

Then, in any module, we can obtain a logger and use it to write logs:

import logging

logger = logging.getLogger(__name__)

if __name__ == '__main__':
    logger.info('Log info')

Use in a multi-process environment

According to the official documentation, logging is thread-safe: multiple threads in one process can safely write logs to the same file at the same time. But (yes, there is a but) multiple processes writing logs to the same file is not safe. The official explanation goes like this:

Because there is no standard way to serialize access to a single file across multiple processes in Python. If you need to log to a single file from multiple processes, one way of doing this is to have all the processes log to a SocketHandler, and have a separate process which implements a socket server which reads from the socket and logs to file. (If you prefer, you can dedicate one thread in one of the existing processes to perform this function.)
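The "dedicated listener" idea from the quote can also be sketched with the standard library's QueueHandler and QueueListener (available since Python 3.2): workers push records onto a shared queue, and a single listener drains the queue and writes the file. The worker count, message count, and file names below are illustrative, and the fork start method assumes a Unix-like OS:

```python
import logging
import logging.handlers
import multiprocessing
import os
import tempfile


def worker(queue, worker_id):
    # Each worker process sends records to the shared queue
    # instead of touching the file directly.
    handler = logging.handlers.QueueHandler(queue)
    logger = logging.getLogger("worker-%d" % worker_id)
    logger.setLevel(logging.INFO)
    logger.addHandler(handler)
    for i in range(100):
        logger.info("worker %d message %d", worker_id, i)


def main():
    log_path = os.path.join(tempfile.mkdtemp(), "mp.log")
    ctx = multiprocessing.get_context("fork")  # assumes Unix
    queue = ctx.Queue(-1)

    file_handler = logging.FileHandler(log_path)
    file_handler.setFormatter(logging.Formatter("%(name)s %(message)s"))

    # The listener drains the queue in a background thread of this
    # process, so only one process ever writes to the file.
    listener = logging.handlers.QueueListener(queue, file_handler)
    listener.start()

    procs = [ctx.Process(target=worker, args=(queue, i)) for i in range(4)]
    for p in procs:
        p.start()
    for p in procs:
        p.join()

    listener.stop()  # flushes remaining records
    file_handler.close()

    with open(log_path) as f:
        return len(f.readlines())


if __name__ == "__main__":
    print(main())  # 4 workers x 100 messages = 400 lines, none lost
```

This keeps all file I/O in a single process, sidestepping the serialization problem the documentation describes.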

Some people will say: then just don't use multiple processes. But because of Python's GIL (see here for the details of that entanglement), multithreading cannot take advantage of multi-core CPUs; in most cases multiple processes are used for that instead. So logging from multiple processes is something we cannot avoid.

To solve this problem, you can use the ConcurrentLogHandler package. Its ConcurrentRotatingFileHandler class can safely write logs to the same file from multiple processes, and can split the log file when it reaches a certain size. The standard logging module also has a TimedRotatingFileHandler class that splits log files by time; unfortunately, ConcurrentLogHandler does not support splitting the log file by time.

Modify the handler class in the configuration above accordingly:

logging.config.dictConfig({
    ...
    'handlers': {
        'file': {
            'level': 'DEBUG',
            # without a concurrency-safe handler class, log records
            # can be lost when multiple instances write to the file
            'class': 'cloghandler.ConcurrentRotatingFileHandler',
            # split the log when the file reaches 10 MB
            'maxBytes': 1024 * 1024 * 10,
            # keep at most 50 backup files
            'backupCount': 50,
            # if delay is true, file opening is deferred
            # until the first call to emit()
            'delay': True,
            'filename': 'logs/mysite.log',
            'formatter': 'verbose'
        }
    },
    ...
})

After running it, you will find that a .lock file is created automatically next to the log file; the handler uses this lock to write to the log file safely.

