Logging Module
Overview
The Logging module is a wrapper for Python's built-in logging module. Because logging from multiple processes is challenging, each process logs to its own log file, and all the log files for one run are collected in a subdirectory of the logs directory. This document outlines the usage and design of the Logging module.
Usage
Each process wishing to log must import the custom logger module in common:
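A minimal sketch of what the import might look like, assuming the custom module is exposed as a top-level logger package containing a logger submodule so that the creation call below resolves as logger.logger.create (the actual import path should be taken from the common repository):

import logger.logger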
Logger creation should be done in the create method of a module:
result, self.__logger = logger.logger.create("name", True)
if not result:
    return False, None
Replace “name” with the desired name of the logger - this will also be the name of the log file. It is suggested to name the logger after the process doing the logging. The second argument is a boolean indicating whether to log to a file.
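For example, a hypothetical video_input process would name its logger after itself and enable logging to a file:

result, self.__logger = logger.logger.create("video_input", True)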
The logger supports the five default logging levels, in increasing order of severity: debug, info, warning, error, critical.
Each logging method takes two arguments: the log message and whether or not to include frame information. Frame information is metadata about the code being run - we extract the file, function, and line number.
To log a message, use the following template, replacing debug with the desired log level:
self.__logger.debug("log message", True)
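A few more calls under the same assumptions, showing other levels and the frame-information flag (the message text is illustrative only):

self.__logger.info("worker started", True)  # include file/function/line metadata
self.__logger.warning("queue is filling up", False)  # skip frame information
self.__logger.error("failed to read frame", True)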
Design Choice
There are several challenges with multiprocessing logging:
Using the default logging module:
- If multiple processes attempt to log to the same file, it will corrupt the file
- We cannot call a global logger directly from a subprocess:
  - Since we are in a multiprocessing environment, calling logging.getLogger() with the name of the desired logger does not give us the desired logger; it creates a new one (see the first sketch below this list)
- Since we cannot have a global logger, one solution is to initialize a logger in each process, but this introduces a bug:
  - If a logger is initialized in each process, each time the file handler is set, it overwrites the existing file
  - e.g. if main initializes a logger to log to example.log, and then video_input initializes a logger to also log to example.log, it erases any logs already made by main
- Another solution is to pass a logger to each process, but it is undesirable to change the function signature of every module wishing to log to include the logger as an input
Using a custom module:
- The custom module would have a logging worker which is started in main along with the other workers
- It would have an input queue of log messages and other information (frame information and log level)
- This solves all of the bugs above; however, we would still need to pass the queue into every module wishing to log, which is again undesirable (see the second sketch below this list)
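The getLogger behaviour can be demonstrated with plain Python, independent of this repository. The sketch below assumes the "spawn" start method; the parent configures a named logger, but the child process gets a fresh, unconfigured logger with the same name:

import logging
import multiprocessing as mp


def child():
    # Under the "spawn" start method the child has its own logger registry,
    # so the named logger here has none of the parent's handlers.
    log = logging.getLogger("example")
    print("child handlers:", log.handlers)  # []


def main():
    log = logging.getLogger("example")
    log.addHandler(logging.FileHandler("example.log"))
    print("parent handlers:", log.handlers)  # [<FileHandler ...>]

    ctx = mp.get_context("spawn")
    worker = ctx.Process(target=child)
    worker.start()
    worker.join()


if __name__ == "__main__":
    main()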
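A minimal sketch of the queue-based approach described above, using only the standard library; names such as logging_worker, video_input, and combined.log are illustrative, not the actual implementation:

import logging
import multiprocessing as mp


def logging_worker(log_queue):
    # Single process that owns the log file and drains the queue.
    worker_logger = logging.getLogger("combined")
    worker_logger.setLevel(logging.DEBUG)
    worker_logger.addHandler(logging.FileHandler("combined.log"))
    while True:
        entry = log_queue.get()
        if entry is None:  # sentinel value signals shutdown
            break
        level, message = entry
        worker_logger.log(level, message)


def video_input(log_queue):
    # Every producer must be handed the queue in order to log.
    log_queue.put((logging.INFO, "video_input started"))


def main():
    log_queue = mp.Queue()

    worker = mp.Process(target=logging_worker, args=(log_queue,))
    worker.start()

    producer = mp.Process(target=video_input, args=(log_queue,))
    producer.start()
    producer.join()

    log_queue.put(None)  # stop the logging worker
    worker.join()


if __name__ == "__main__":
    main()

Only the logging worker ever touches the file, which avoids the corruption and overwrite problems, but, as noted above, the queue still has to be threaded through every module that wants to log.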
Further reading on multiprocessing logging: