Python logger module and pythonlogger Module

1 Introduction to the logging Module

The logging module is part of Python's standard library. It is mainly used to output runtime logs, and it lets you set the log output level, the log storage path, and log file rotation. Compared with print, it offers finer control: messages carry severity levels and timestamps, output can be filtered by level, and it can be directed to files or other destinations.

A Logger is never instantiated directly; it is obtained through the module-level function logging.getLogger(name). If name is omitted, the root logger is returned. Logger names are dot-separated hierarchies such as "a.b.c". Calling logging.getLogger() repeatedly with the same name returns the same logger object. Under this naming scheme, a logger such as "a.b" is a child of the logger "a" and automatically inherits its parent's configuration. It is therefore unnecessary to configure every logger in an application: configure the top-level logger, and let child loggers inherit from it as needed. A Logger object plays three roles. First, it exposes several methods so the application can write logs at runtime. Second, it decides how to handle a log record based on the record's severity or on filter objects (a default filter is used if none is set). Finally, it passes log records on to the relevant handlers.

2 Using the logging module

2.1 Basic usage

Configure basic logging settings and output logs to the console:

import logging

logging.basicConfig(level=logging.INFO,
                    format='%(asctime)s - %(name)s - %(levelname)s - %(message)s')
logger = logging.getLogger(__name__)

logger.info("Start print log")
logger.debug("Do something")
logger.warning("Something maybe fail.")
logger.info("Finish")

Console output at runtime:

2016-10-09 19:11:19,434 - __main__ - INFO - Start print log
2016-10-09 19:11:19,434 - __main__ - WARNING - Something maybe fail.
2016-10-09 19:11:19,434 - __main__ - INFO - Finish

logging provides several message levels: DEBUG, INFO, WARNING, ERROR, and CRITICAL. By assigning different levels to a logger or a handler, developers can send only error information to a specific log file, or record only debugging information while debugging.
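
As a minimal sketch of how these levels behave (the logger name level_demo is made up for illustration):

```python
import logging

# The built-in levels are increasing integers
levels = [logging.DEBUG, logging.INFO, logging.WARNING, logging.ERROR, logging.CRITICAL]
print(levels)  # [10, 20, 30, 40, 50]
print(logging.getLevelName(logging.WARNING))  # WARNING

# A logger only handles records at or above its own level
logger = logging.getLogger("level_demo")  # hypothetical name for this sketch
logger.setLevel(logging.WARNING)
print(logger.isEnabledFor(logging.INFO))   # False
print(logger.isEnabledFor(logging.ERROR))  # True
```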

 

Change the logger level to DEBUG, then observe the output:

logging.basicConfig(level=logging.DEBUG,
                    format='%(asctime)s - %(name)s - %(levelname)s - %(message)s')

The output now includes the debug message:

2016-10-09 19:12:08,289 - __main__ - INFO - Start print log
2016-10-09 19:12:08,289 - __main__ - DEBUG - Do something
2016-10-09 19:12:08,289 - __main__ - WARNING - Something maybe fail.
2016-10-09 19:12:08,289 - __main__ - INFO - Finish
Parameters of the logging.basicConfig function:

- filename: specifies the log file name.
- filemode: same meaning as the mode argument of open(); specifies how the log file is opened, 'w' or 'a'.
- format: specifies the output format and content; the format string can include many useful fields.
- datefmt: specifies the time format, as in time.strftime().
- level: sets the log level; the default is logging.WARNING.
- stream: specifies the output stream, e.g. sys.stderr, sys.stdout, or a file; the default is sys.stderr. When both stream and filename are given, stream is ignored.
A Formatter defines the output format of the records a Logger produces, i.e. the content format of each final log line. The application can instantiate the Formatter class directly. In the format string, fields are substituted using %(<dictionary key>)s placeholders.
Attribute name    Format                 Description
name              %(name)s               Logger name
asctime           %(asctime)s            Human-readable time, e.g. '2003-07-08 16:49:45,896'; the value after the comma is milliseconds
filename          %(filename)s           File name, the last part of pathname
pathname          %(pathname)s           Full path of the source file
funcName          %(funcName)s           Name of the function containing the logging call
levelname         %(levelname)s          Log level name
levelno           %(levelno)s            Numeric log level
lineno            %(lineno)d             Source line number of the logging call
module            %(module)s             Module name
msecs             %(msecs)d              Millisecond portion of the timestamp
process           %(process)d            Process ID
processName       %(processName)s        Process name
thread            %(thread)d             Thread ID
threadName        %(threadName)s         Thread name
relativeCreated   %(relativeCreated)d    Time in milliseconds since the logging module was loaded
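
A few of these attributes can be seen together in one formatted record (an illustrative sketch; the logger name fmt_demo and the format string are made up):

```python
import io
import logging

stream = io.StringIO()
handler = logging.StreamHandler(stream)
handler.setFormatter(logging.Formatter(
    "%(name)s - %(levelname)s - %(levelno)s - %(message)s"))

logger = logging.getLogger("fmt_demo")  # hypothetical name for this sketch
logger.setLevel(logging.INFO)
logger.addHandler(handler)
logger.propagate = False  # keep the demo out of the root logger's output

logger.warning("disk almost full")
print(stream.getvalue())  # fmt_demo - WARNING - 30 - disk almost full
```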
2.2 Writing logs to files

2.2.1 Writing logs to a file

Create a FileHandler, set the format of the output messages, and add it to the logger; logs are then written to the specified file:

import logging

logger = logging.getLogger(__name__)
logger.setLevel(level=logging.INFO)
handler = logging.FileHandler("log.txt")
handler.setLevel(logging.INFO)
formatter = logging.Formatter('%(asctime)s - %(name)s - %(levelname)s - %(message)s')
handler.setFormatter(formatter)
logger.addHandler(handler)

logger.info("Start print log")
logger.debug("Do something")
logger.warning("Something maybe fail.")
logger.info("Finish")

The log data in log.txt is:

15:02:09,905 - __main__ - INFO - Start print log
15:02:09,905 - __main__ - WARNING - Something maybe fail.
15:02:09,905 - __main__ - INFO - Finish

2.2.2 Output logs to both the screen and a log file

Add a StreamHandler to the logger so that logs also go to the screen:

import logging

logger = logging.getLogger(__name__)
logger.setLevel(level=logging.INFO)
handler = logging.FileHandler("log.txt")
handler.setLevel(logging.INFO)
formatter = logging.Formatter('%(asctime)s - %(name)s - %(levelname)s - %(message)s')
handler.setFormatter(formatter)

console = logging.StreamHandler()
console.setLevel(logging.INFO)

logger.addHandler(handler)
logger.addHandler(console)

logger.info("Start print log")
logger.debug("Do something")
logger.warning("Something maybe fail.")
logger.info("Finish")

The same output appears in both the log.txt file and on the console:

15:03:05,075 - __main__ - INFO - Start print log
15:03:05,075 - __main__ - WARNING - Something maybe fail.
15:03:05,075 - __main__ - INFO - Finish

As can be seen, logging works through one main logger object, and additional processing is attached with addHandler. The handlers provided by logging mainly include the following types:

- StreamHandler (logging.StreamHandler): outputs logs to a stream, such as sys.stderr, sys.stdout, or a file
- FileHandler (logging.FileHandler): outputs logs to a file
- BaseRotatingHandler (logging.handlers.BaseRotatingHandler): base class for handlers that rotate log files
- RotatingFileHandler (logging.handlers.RotatingFileHandler): rotates log files by size, supporting a maximum file size and a backup count
- TimedRotatingFileHandler (logging.handlers.TimedRotatingFileHandler): rotates log files at timed intervals
- SocketHandler (logging.handlers.SocketHandler): sends logs to a remote TCP/IP socket
- DatagramHandler (logging.handlers.DatagramHandler): sends logs to a remote UDP socket
- SMTPHandler (logging.handlers.SMTPHandler): sends logs to an email address
- SysLogHandler (logging.handlers.SysLogHandler): sends logs to syslog
- NTEventLogHandler (logging.handlers.NTEventLogHandler): sends logs to the Windows NT/2000/XP event log
- MemoryHandler (logging.handlers.MemoryHandler): buffers logs in a specified memory buffer
- HTTPHandler (logging.handlers.HTTPHandler): sends logs to an HTTP server via "GET" or "POST"
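
For instance, two StreamHandlers with different levels can split output, which is the same mechanism the file-plus-console example above uses (a sketch with in-memory streams standing in for real destinations; the logger name split_demo is made up):

```python
import io
import logging

all_stream, err_stream = io.StringIO(), io.StringIO()

# One handler records everything, the other only errors
all_handler = logging.StreamHandler(all_stream)
all_handler.setLevel(logging.DEBUG)
err_handler = logging.StreamHandler(err_stream)
err_handler.setLevel(logging.ERROR)

logger = logging.getLogger("split_demo")  # hypothetical name for this sketch
logger.setLevel(logging.DEBUG)
logger.propagate = False  # keep the demo out of the root logger's output
logger.addHandler(all_handler)
logger.addHandler(err_handler)

logger.info("routine message")
logger.error("something broke")

print(all_stream.getvalue())  # both messages
print(err_stream.getvalue())  # only the error
```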

 

2.2.3 Log rotation

Use RotatingFileHandler to rotate log files:

import logging
from logging.handlers import RotatingFileHandler

logger = logging.getLogger(__name__)
logger.setLevel(level=logging.INFO)

# Define a RotatingFileHandler: keep up to 3 backup files, each at most 1 KB
rHandler = RotatingFileHandler("log.txt", maxBytes=1*1024, backupCount=3)
rHandler.setLevel(logging.INFO)
formatter = logging.Formatter('%(asctime)s - %(name)s - %(levelname)s - %(message)s')
rHandler.setFormatter(formatter)

console = logging.StreamHandler()
console.setLevel(logging.INFO)
console.setFormatter(formatter)

logger.addHandler(rHandler)
logger.addHandler(console)

logger.info("Start print log")
logger.debug("Do something")
logger.warning("Something maybe fail.")
logger.info("Finish")

The backed-up log files can then be seen in the project directory.
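
The rollover behavior can be sketched in a temporary directory (the tiny maxBytes value and the logger name rotate_demo are chosen only for this illustration):

```python
import logging
import os
import tempfile
from logging.handlers import RotatingFileHandler

log_dir = tempfile.mkdtemp()  # throwaway directory for the sketch
log_file = os.path.join(log_dir, "log.txt")

logger = logging.getLogger("rotate_demo")  # hypothetical name
logger.setLevel(logging.INFO)
logger.propagate = False
# A tiny maxBytes forces a rollover after almost every message
handler = RotatingFileHandler(log_file, maxBytes=100, backupCount=3)
logger.addHandler(handler)

for i in range(20):
    logger.info("message number %d with some padding to exceed maxBytes", i)

files = sorted(os.listdir(log_dir))
print(files)  # ['log.txt', 'log.txt.1', 'log.txt.2', 'log.txt.3']
```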

2.3 Setting the message level

Different log levels can be set to control log output.

Log level   When to use
CRITICAL    Especially bad events, such as memory exhaustion or no disk space left; rarely used
ERROR       An error occurred, such as an IO operation failure or a connection problem
WARNING     Something important but not an error happened, such as a wrong user login password
INFO        Routine events, such as request handling or state changes
DEBUG       Used during debugging, e.g. to record the intermediate state of each loop in an algorithm

setLevel(lvl) defines the minimum level the logger will process. The built-in levels are DEBUG, INFO, WARNING, ERROR, and CRITICAL; lvl is the value corresponding to one of these levels.
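
These rules can be checked directly (a standalone sketch; the logger names are made up). A logger left at the default NOTSET level falls back to its ancestor's level:

```python
import logging

# The same name always yields the same logger object
parent = logging.getLogger("eff_demo")  # hypothetical name
assert parent is logging.getLogger("eff_demo")

parent.setLevel(logging.ERROR)
child = logging.getLogger("eff_demo.child")  # its own level stays NOTSET (0)

# With no level of its own, the child inherits ERROR from its parent
print(child.level)                          # 0
print(child.getEffectiveLevel())            # 40, i.e. logging.ERROR
print(child.isEnabledFor(logging.WARNING))  # False
```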

2.4 Capturing tracebacks

Python's traceback module traces exception information, and the traceback can be recorded through logging:

import logging

logger = logging.getLogger(__name__)
logger.setLevel(level=logging.INFO)
handler = logging.FileHandler("log.txt")
handler.setLevel(logging.INFO)
formatter = logging.Formatter('%(asctime)s - %(name)s - %(levelname)s - %(message)s')
handler.setFormatter(formatter)
console = logging.StreamHandler()
console.setLevel(logging.INFO)
logger.addHandler(handler)
logger.addHandler(console)

logger.info("Start print log")
logger.debug("Do something")
logger.warning("Something maybe fail.")
try:
    open("sklearn.txt", "rb")
except (SystemExit, KeyboardInterrupt):
    raise
except Exception:
    logger.error("Faild to open sklearn.txt from logger.error", exc_info=True)
logger.info("Finish")

Output on the console and in the log file log.txt:

15:04:24,045 - __main__ - INFO - Start print log
15:04:24,045 - __main__ - WARNING - Something maybe fail.
15:04:24,046 - __main__ - ERROR - Faild to open sklearn.txt from logger.error
Traceback (most recent call last):
  File "E:\PYTHON\Eclipse\eclipse\Doc\14day5\Logger module\Logging.py", line 71, in <module>
    open("sklearn.txt","rb")
IOError: [Errno 2] No such file or directory: 'sklearn.txt'
15:04:24,049 - __main__ - INFO - Finish

 

You can also use logger.exception(msg, *args), which is equivalent to logger.error(msg, exc_info=True, *args).

Replace logger.error("Faild to open sklearn.txt from logger.error", exc_info=True) with logger.exception("Failed to open sklearn.txt from logger.exception").
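
The effect of logger.exception can be verified by pointing a handler at an in-memory stream (an illustrative sketch; the logger name exc_demo is made up):

```python
import io
import logging

stream = io.StringIO()
logger = logging.getLogger("exc_demo")  # hypothetical name
logger.setLevel(logging.INFO)
logger.propagate = False
logger.addHandler(logging.StreamHandler(stream))

try:
    1 / 0
except Exception:
    # logger.exception logs at ERROR level and appends the traceback
    logger.exception("division failed")

output = stream.getvalue()
print("Traceback (most recent call last):" in output)  # True
print("ZeroDivisionError" in output)                   # True
```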

 

2.5 Using logging in multiple modules

Main module mainModule.py:

import logging
import subModule

logger = logging.getLogger("mainModule")
logger.setLevel(level=logging.INFO)
handler = logging.FileHandler("log.txt")
handler.setLevel(logging.INFO)
formatter = logging.Formatter('%(asctime)s - %(name)s - %(levelname)s - %(message)s')
handler.setFormatter(formatter)
console = logging.StreamHandler()
console.setLevel(logging.INFO)
console.setFormatter(formatter)
logger.addHandler(handler)
logger.addHandler(console)

logger.info("creating an instance of subModule.subModuleClass")
a = subModule.SubModuleClass()
logger.info("calling subModule.subModuleClass.doSomething")
a.doSomething()
logger.info("done with subModule.subModuleClass.doSomething")
logger.info("calling subModule.some_function")
subModule.som_function()
logger.info("done with subModule.some_function")

subModule.py:

import logging

module_logger = logging.getLogger("mainModule.sub")

class SubModuleClass(object):
    def __init__(self):
        self.logger = logging.getLogger("mainModule.sub.module")
        self.logger.info("creating an instance in SubModuleClass")

    def doSomething(self):
        self.logger.info("do something in SubModule")
        a = []
        a.append(1)
        self.logger.debug("list a = " + str(a))
        self.logger.info("finish something in SubModuleClass")

def som_function():
    module_logger.info("call function some_function")

Output on the console and in the log file log.txt after execution:

2017-07-25 15:05:07,427 - mainModule - INFO - creating an instance of subModule.subModuleClass
2017-07-25 15:05:07,427 - mainModule.sub.module - INFO - creating an instance in SubModuleClass
2017-07-25 15:05:07,427 - mainModule - INFO - calling subModule.subModuleClass.doSomething
2017-07-25 15:05:07,427 - mainModule.sub.module - INFO - do something in SubModule
2017-07-25 15:05:07,427 - mainModule.sub.module - INFO - finish something in SubModuleClass
2017-07-25 15:05:07,427 - mainModule - INFO - done with subModule.subModuleClass.doSomething
2017-07-25 15:05:07,427 - mainModule - INFO - calling subModule.some_function
2017-07-25 15:05:07,427 - mainModule.sub - INFO - call function some_function
2017-07-25 15:05:07,428 - mainModule - INFO - done with subModule.some_function

 

Note:

First, the logger 'mainModule' is defined and configured in the main module. Anywhere else in the same interpreter process, getLogger('mainModule') returns the very same object, so it can be used directly without reconfiguration. Child loggers share the definition and configuration of their parent logger. Parent and child loggers are identified purely by name: any logger whose name starts with 'mainModule.' is a child of 'mainModule', for example 'mainModule.sub'.

When developing an application, you can first write a logging configuration file for the application and use it to define a root logger, such as 'pythonApp', then load that configuration through fileConfig in the main function. After that, modules simply use child loggers of this root logger, such as 'pythonApp.core' or 'pythonApp.web', without needing to define and configure a logger in every module.
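
The payoff of configuring only the top-level logger can be sketched as follows (the 'pythonApp' names are illustrative, and an in-memory stream stands in for a real handler destination):

```python
import io
import logging

# Configure only the application's top-level logger once
stream = io.StringIO()
app_logger = logging.getLogger("pythonApp")  # hypothetical app-level name
app_logger.setLevel(logging.INFO)
app_logger.propagate = False
handler = logging.StreamHandler(stream)
handler.setFormatter(logging.Formatter("%(name)s - %(message)s"))
app_logger.addHandler(handler)

# Any module can then grab a child logger with no extra setup;
# its records propagate up to the parent's handler
core_logger = logging.getLogger("pythonApp.core")
core_logger.info("core module started")

print(stream.getvalue())  # pythonApp.core - core module started
```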

 

3 Configure the logging module through a JSON or YAML file

Although logging can be configured in Python code, that approach is not flexible enough. A better way is to use a configuration file. Since Python 2.7, the logging configuration can be loaded from a dictionary, which means it can come from a JSON or YAML file.

3.1 Configure through a JSON file

JSON configuration file

{
    "version": 1,
    "disable_existing_loggers": false,
    "formatters": {
        "simple": {
            "format": "%(asctime)s - %(name)s - %(levelname)s - %(message)s"
        }
    },
    "handlers": {
        "console": {
            "class": "logging.StreamHandler",
            "level": "DEBUG",
            "formatter": "simple",
            "stream": "ext://sys.stdout"
        },
        "info_file_handler": {
            "class": "logging.handlers.RotatingFileHandler",
            "level": "INFO",
            "formatter": "simple",
            "filename": "info.log",
            "maxBytes": 10485760,
            "backupCount": 20,
            "encoding": "utf8"
        },
        "error_file_handler": {
            "class": "logging.handlers.RotatingFileHandler",
            "level": "ERROR",
            "formatter": "simple",
            "filename": "errors.log",
            "maxBytes": 10485760,
            "backupCount": 20,
            "encoding": "utf8"
        }
    },
    "loggers": {
        "my_module": {
            "level": "ERROR",
            "handlers": ["info_file_handler"],
            "propagate": false
        }
    },
    "root": {
        "level": "INFO",
        "handlers": ["console", "info_file_handler", "error_file_handler"]
    }
}

Load the JSON configuration file and configure logging through logging.config.dictConfig:

import json
import logging.config
import os

def setup_logging(default_path="logging.json", default_level=logging.INFO, env_key="LOG_CFG"):
    path = default_path
    value = os.getenv(env_key, None)
    if value:
        path = value
    if os.path.exists(path):
        with open(path, "r") as f:
            config = json.load(f)
            logging.config.dictConfig(config)
    else:
        logging.basicConfig(level=default_level)

def func():
    logging.info("start func")
    logging.info("exec func")
    logging.info("end func")

if __name__ == "__main__":
    setup_logging(default_path="logging.json")
    func()
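
logging.config.dictConfig accepts the same structure as a plain dict built in code; here is a trimmed sketch of the configuration above (file handlers omitted so the example stays self-contained):

```python
import logging
import logging.config

# A trimmed, in-code version of the JSON configuration above
config = {
    "version": 1,
    "disable_existing_loggers": False,
    "formatters": {
        "simple": {"format": "%(name)s - %(levelname)s - %(message)s"}
    },
    "handlers": {
        "console": {
            "class": "logging.StreamHandler",
            "level": "DEBUG",
            "formatter": "simple",
            "stream": "ext://sys.stdout",
        }
    },
    "loggers": {
        "my_module": {"level": "ERROR", "handlers": ["console"], "propagate": False}
    },
    "root": {"level": "INFO", "handlers": ["console"]},
}
logging.config.dictConfig(config)

print(logging.getLogger("my_module").level == logging.ERROR)  # True
print(logging.getLogger().level == logging.INFO)              # True
```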

  

3.2 Configure through a YAML file

The same configuration through a YAML file, which reads more clearly than JSON:

version: 1
disable_existing_loggers: False
formatters:
    simple:
        format: "%(asctime)s - %(name)s - %(levelname)s - %(message)s"
handlers:
    console:
        class: logging.StreamHandler
        level: DEBUG
        formatter: simple
        stream: ext://sys.stdout
    info_file_handler:
        class: logging.handlers.RotatingFileHandler
        level: INFO
        formatter: simple
        filename: info.log
        maxBytes: 10485760
        backupCount: 20
        encoding: utf8
    error_file_handler:
        class: logging.handlers.RotatingFileHandler
        level: ERROR
        formatter: simple
        filename: errors.log
        maxBytes: 10485760
        backupCount: 20
        encoding: utf8
loggers:
    my_module:
        level: ERROR
        handlers: [info_file_handler]
        propagate: no
root:
    level: INFO
    handlers: [console, info_file_handler, error_file_handler]

Load the YAML configuration file, then configure logging through logging.config.dictConfig:

import yaml
import logging.config
import os

def setup_logging(default_path="logging.yaml", default_level=logging.INFO, env_key="LOG_CFG"):
    path = default_path
    value = os.getenv(env_key, None)
    if value:
        path = value
    if os.path.exists(path):
        with open(path, "r") as f:
            config = yaml.safe_load(f)
            logging.config.dictConfig(config)
    else:
        logging.basicConfig(level=default_level)

def func():
    logging.info("start func")
    logging.info("exec func")
    logging.info("end func")

if __name__ == "__main__":
    setup_logging(default_path="logging.yaml")
    func()

  

4 Reference

http://wjdadi-gmail-com.iteye.com/blog/1984354

Some things about logging: Repeated log writing by python logging

 

From: http://www.cnblogs.com/zhbzz2007/p/5943685.html
