Many applications have a log module that records key information about the system while it runs, so that its running status can be tracked. This article mainly introduces how to use Python's logging module in a multi-process environment. For more information, see the following.
Preface
I believe every programmer knows that when writing background tasks in Python, it is often necessary to output logs that record the program's running status and save the details of any error, to help with later debugging and analysis. Python's logging module is a good helper in this case.
The logging module lets you specify the log level, such as DEBUG, INFO, WARNING, ERROR, and CRITICAL. For example, you can output logs at DEBUG level and above during development and debugging, while in the production environment only INFO level and above is output. (If not specified, the default level is WARNING.)
Logging can also direct output to the command line or to a file, and can split log files by time or by size.
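As a quick illustration of level filtering, here is a minimal sketch using basicConfig (the format string is just an example):

import logging

# With the level set to INFO, DEBUG messages are dropped while
# INFO, WARNING, ERROR and CRITICAL messages are emitted.
logging.basicConfig(
    level=logging.INFO,
    format='%(asctime)s %(levelname)s %(message)s')

logging.debug('filtered out at INFO level')
logging.info('emitted')
logging.warning('emitted')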
For more information about how to use logging, see the official documentation.
Logging configuration
In general, we want to save logs to a file and have the file split automatically, so that no single log file grows too large. The following is an example logging configuration.
import logging.config

logging.config.dictConfig({
    'version': 1,
    'disable_existing_loggers': True,
    'formatters': {
        'verbose': {
            'format': "[%(asctime)s] %(levelname)s [%(name)s:%(lineno)s] %(message)s",
            'datefmt': "%Y-%m-%d %H:%M:%S"
        },
        'simple': {
            'format': '%(levelname)s %(message)s'
        },
    },
    'handlers': {
        'null': {
            'level': 'DEBUG',
            'class': 'logging.NullHandler',
        },
        'console': {
            'level': 'DEBUG',
            'class': 'logging.StreamHandler',
            'formatter': 'verbose'
        },
        'file': {
            'level': 'DEBUG',
            'class': 'logging.handlers.RotatingFileHandler',
            # split the file when the log reaches 10 MB
            'maxBytes': 1024 * 1024 * 10,
            # keep up to 50 backup files
            'backupCount': 50,
            # if delay is true, file opening is deferred
            # until the first call to emit()
            'delay': True,
            'filename': 'logs/mysite.log',
            'formatter': 'verbose'
        }
    },
    'loggers': {
        '': {
            'handlers': ['file'],
            'level': 'INFO',
        },
    }
})
We can then record logs in any module like this. (Note that the configuration above should run before modules create their loggers, since disable_existing_loggers is True.)
import logging

logger = logging.getLogger(__name__)

if __name__ == '__main__':
    logger.info('log info')
Use in a multi-process environment
According to the official documentation, logging is thread-safe; that is to say, it is safe for multiple threads in one process to write logs to the same file simultaneously. However, it is not safe for multiple processes to write logs to the same file. The official statement is as follows:
Because there is no standard way to serialize access to a single file across multiple processes in Python. If you need to log to a single file from multiple processes, one way of doing this is to have all the processes log to a SocketHandler, and have a separate process which implements a socket server which reads from the socket and logs to file. (If you prefer, you can dedicate one thread in one of the existing processes to perform this function.)
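A minimal sketch of the sending side of this approach is shown below; the host and port are assumptions for illustration, and the receiving socket-server process (a full example of which is in the official logging cookbook) is omitted:

import logging
import logging.handlers

# Each worker process attaches a SocketHandler; records are pickled
# and sent over TCP to a separate listener process, which is the
# only process that actually writes the log file.
logger = logging.getLogger(__name__)
socket_handler = logging.handlers.SocketHandler(
    'localhost', logging.handlers.DEFAULT_TCP_LOGGING_PORT)
logger.addHandler(socket_handler)
logger.setLevel(logging.INFO)

logger.info('handled by the logging server process')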
Some may say: then I simply won't use multiple processes. However, Python has the Global Interpreter Lock (GIL), which prevents multiple threads from making use of multiple CPU cores, so in most cases multiple processes are used instead. The problem of logging from multiple processes therefore still has to be solved.
To solve this problem, you can use ConcurrentLogHandler, a third-party package (installable with pip install ConcurrentLogHandler). ConcurrentLogHandler can safely write logs to the same file from multiple processes, and can split the log file when it reaches a specified size. The standard logging module also provides a TimedRotatingFileHandler class that splits log files by time; unfortunately, ConcurrentLogHandler does not support this time-based splitting.
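For comparison, time-based splitting with the standard library looks roughly like the sketch below (the filename and rotation schedule are illustrative; remember this handler is only safe within a single process):

import logging
from logging.handlers import TimedRotatingFileHandler

# Rotate at midnight and keep the last 7 files; safe only
# when a single process writes to the file.
logger = logging.getLogger('timed_demo')
handler = TimedRotatingFileHandler(
    'logs/mysite.log', when='midnight', backupCount=7)
handler.setFormatter(logging.Formatter(
    '[%(asctime)s] %(levelname)s [%(name)s:%(lineno)s] %(message)s'))
logger.addHandler(handler)
logger.setLevel(logging.INFO)

logger.info('split by time, not by size')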
To use ConcurrentRotatingFileHandler, simply change the handler class in the handlers section:
logging.config.dictConfig({
    ...
    'handlers': {
        'file': {
            'level': 'DEBUG',
            # without a concurrency-safe handler class, log records can
            # be lost when multiple instances write at the same time
            'class': 'cloghandler.ConcurrentRotatingFileHandler',
            # split the file when the log reaches 10 MB
            'maxBytes': 1024 * 1024 * 10,
            # keep up to 50 backup files
            'backupCount': 50,
            # if delay is true, file opening is deferred
            # until the first call to emit()
            'delay': True,
            'filename': 'logs/mysite.log',
            'formatter': 'verbose'
        }
    },
    ...
})
After running, you will find that a .lock file is created automatically; the handler locks this file so that logs can be written to the shared log file safely.
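To see this in action, here is a minimal sketch (assuming the ConcurrentLogHandler package is installed and a logs/ directory exists) in which several processes write to the same rotating file:

import logging
import multiprocessing

from cloghandler import ConcurrentRotatingFileHandler

def worker(n):
    # each process configures its own handler; the .lock file
    # serializes writes across processes
    logger = logging.getLogger('demo')
    handler = ConcurrentRotatingFileHandler(
        'logs/mysite.log', 'a', maxBytes=1024 * 1024 * 10, backupCount=50)
    logger.addHandler(handler)
    logger.setLevel(logging.INFO)
    for i in range(100):
        logger.info('process %s message %s', n, i)

if __name__ == '__main__':
    procs = [multiprocessing.Process(target=worker, args=(n,))
             for n in range(4)]
    for p in procs:
        p.start()
    for p in procs:
        p.join()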