Objective
Every programmer knows that when writing background tasks in Python, you usually want to log the program's state as it runs, and to record the details of any error so you can debug and analyze it later. Python's logging module is a good helper here.
The logging module lets you specify a log level: DEBUG, INFO, WARNING, ERROR, or CRITICAL. For example, during development and debugging you can output everything at DEBUG level and above, while in production you output only INFO and above. (If no level is specified, the default is WARNING.)
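As a quick illustration of level filtering, here is a minimal sketch using basicConfig; the messages are just placeholders:

import logging

# show INFO and above; DEBUG messages are filtered out
logging.basicConfig(level=logging.INFO)
logging.debug('not shown: below the configured level')
logging.info('shown')
logging.warning('shown; WARNING is also the default when no level is set')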
Logging can also send output to the console or to a file, and can split the log file by time or by size.
The detailed usage of logging is not covered here; you can refer to the official documentation, or the introduction here.
Configuration of logging
Normally we want to save logs to a file and have the file split automatically, so that no single log file grows too large. An example logging configuration is given below.
import logging.config

logging.config.dictConfig({
    'version': 1,
    'disable_existing_loggers': True,
    'formatters': {
        'verbose': {
            'format': "[%(asctime)s] %(levelname)s [%(name)s:%(lineno)s] %(message)s",
            'datefmt': "%Y-%m-%d %H:%M:%S"
        },
        'simple': {'format': '%(levelname)s %(message)s'},
    },
    'handlers': {
        'null': {'class': 'logging.NullHandler'},
        'console': {'class': 'logging.StreamHandler', 'formatter': 'verbose'},
        'file': {
            'class': 'logging.handlers.RotatingFileHandler',
            # split the log when it reaches 10 MB
            'maxBytes': 1024 * 1024 * 10,
            # keep at most 50 backup files
            'backupCount': 50,
            # defer opening the file until the first call to emit()
            'delay': True,
            'filename': 'logs/mysite.log',
            'formatter': 'verbose'
        },
    },
    'loggers': {
        '': {'handlers': ['file'], 'level': 'INFO'},
    },
})
Then, inside a module, we can get a logger and write logs like this:
import logging

logger = logging.getLogger(__name__)

if __name__ == '__main__':
    logger.info('log info')
Use in a multi-process environment
According to the official documentation, logging is thread-safe: multiple threads in one process can safely write logs to the same file at the same time. But (yes, there is a but) multiple processes writing logs to the same file is not safe. The official wording is:
Because there is no standard way to serialize access to a single file across multiple processes in Python. If you need to log to a single file from multiple processes, one way of doing this is to have all the processes log to a SocketHandler, and have a separate process which implements a socket server which reads from the socket and logs to file. (If you prefer, you can dedicate one thread in one of the existing processes to perform this function.)
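To make that suggestion concrete, here is a minimal sketch of the client side of the approach; the host and port are assumptions, and a separate process must run a socket server that reads the pickled records and writes them to the file (the logging cookbook has a full server example).

import logging
import logging.handlers

# each worker process sends records over a socket instead of
# touching the log file itself
socket_handler = logging.handlers.SocketHandler(
    'localhost', logging.handlers.DEFAULT_TCP_LOGGING_PORT)
root = logging.getLogger()
root.addHandler(socket_handler)
root.setLevel(logging.INFO)
root.info('handled by the server process that owns the file')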
Some will say: then I simply won't use multiple processes. But Python has the GIL, a big global lock (see here for the GIL's entanglements), so multithreading cannot take advantage of multi-core CPUs; in most cases multiple processes are used to exploit multiple cores, so we still cannot get around logging from multiple processes.
To solve this problem, you can use ConcurrentLogHandler. ConcurrentLogHandler can safely write logs to the same file in a multi-process environment, and can split the log file when it reaches a certain size. The standard logging module also has a TimedRotatingFileHandler class that splits log files by time; unfortunately, ConcurrentLogHandler does not support splitting the log file by time.
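For comparison, a minimal sketch of the standard TimedRotatingFileHandler; the rotation settings here are just illustrative, and remember it is not safe across processes:

import logging
import logging.handlers

# rotate at midnight and keep 30 old files; NOT safe when several
# processes write to the same file
handler = logging.handlers.TimedRotatingFileHandler(
    'logs/mysite.log', when='midnight', backupCount=30)
logging.getLogger().addHandler(handler)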
To use ConcurrentLogHandler, we just change the handler class in the handlers section of the configuration above.
logging.config.dictConfig({
    # ... the rest of the configuration is the same as above ...
    'handlers': {
        'file': {
            # without a concurrent log handler class, log records
            # go missing when multiple instances write to one file
            'class': 'cloghandler.ConcurrentRotatingFileHandler',
            # split the log when it reaches 10 MB
            'maxBytes': 1024 * 1024 * 10,
            # keep at most 50 backup files
            'backupCount': 50,
            # defer opening the file until the first call to emit()
            'delay': True,
            'filename': 'logs/mysite.log',
            'formatter': 'verbose'
        },
    },
})
After running this, you will find that a .lock file is created automatically, and the handler uses that lock to write the log file safely.
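As a quick way to see it working, here is a sketch that assumes the dictConfig above has already been applied (for example at module import time); the worker function and message counts are made up for illustration:

import logging
import multiprocessing

logger = logging.getLogger(__name__)

def worker(n):
    # all four processes append to logs/mysite.log; the .lock file
    # serializes their writes so no record is lost
    for i in range(100):
        logger.info('message %d from process %d', i, n)

if __name__ == '__main__':
    procs = [multiprocessing.Process(target=worker, args=(n,)) for n in range(4)]
    for p in procs:
        p.start()
    for p in procs:
        p.join()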