You may find yourself wanting to add or supplement the built-in loggers so that Dagster logs are integrated with the rest of your log aggregation and monitoring infrastructure.
For example, you may be operating in a containerized environment where container stdout is aggregated by a tool such as Logstash. In this kind of environment, where logs will be aggregated and parsed by machine, the multi-line output from the default colored console logger is unhelpful. Instead, we'd much prefer to see single-line, structured log messages like:
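{"name": "dagster", "levelname": "INFO", "msg": "Hello, world!", "created": 1700000000.0}

(An abbreviated, illustrative line; the exact fields depend on the formatter you configure, such as the JSON formatter defined below.)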
Loggers are defined internally using the LoggerDefinition class, but, following a common pattern in the Dagster codebase, the @logger decorator exposes a simpler API for the common use case and is typically what you'll use to define your own loggers.
The decorated function should take a single argument, the init_context available during logger initialization, and return a logging.Logger:
@logger({"log_level": Field(str, is_required=False, default_value="INFO"),"name": Field(str, is_required=False, default_value="dagster"),},
description="A JSON-formatted console logger",)defjson_console_logger(init_context):
level = init_context.logger_config["log_level"]
name = init_context.logger_config["name"]
klass = logging.getLoggerClass()
logger_ = klass(name, level=level)
handler = logging.StreamHandler()classJsonFormatter(logging.Formatter):defformat(self, record):return json.dumps({
k: v
for k, v in record.__dict__.items()# values for these keys are not directly JSON-serializableif k notin["dagster_event","dagster_meta"]})
handler.setFormatter(JsonFormatter())
logger_.addHandler(handler)return logger_
@opdefhello_logs(context: OpExecutionContext):
context.log.info("Hello, world!")@job(logger_defs={"my_json_logger": json_console_logger})defdemo_job():
hello_logs()
You select the logger by the name given to it in logger_defs (here, my_json_logger) under the loggers key of the run config. The logger also takes a config argument (the first argument to @logger), representing the config that users can pass to it. For example, you can set the log level when launching the job:
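A minimal sketch using execute_in_process, assuming the demo_job and json_console_logger definitions above (the same run config structure can be supplied as YAML when launching from the Dagster UI):

result = demo_job.execute_in_process(
    run_config={
        "loggers": {
            # Key matches the name given in logger_defs on the job.
            "my_json_logger": {
                "config": {"log_level": "INFO"},
            }
        }
    }
)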