structlog


structlog - Quick Reference


When to Use This Skill


  • Structured logging in Python
  • Integration with JSON logging
  • Context binding for request tracing

Deep Knowledge: Use `mcp__documentation__fetch_docs` with technology: `structlog` for comprehensive documentation.

Basic Setup


```bash
pip install structlog
```

Essential Patterns


Basic Configuration


```python
import structlog

structlog.configure(
    processors=[
        structlog.stdlib.filter_by_level,
        structlog.stdlib.add_logger_name,
        structlog.stdlib.add_log_level,
        structlog.stdlib.PositionalArgumentsFormatter(),
        structlog.processors.TimeStamper(fmt="iso"),
        structlog.processors.StackInfoRenderer(),
        structlog.processors.format_exc_info,
        structlog.processors.UnicodeDecoder(),
        structlog.processors.JSONRenderer(),
    ],
    wrapper_class=structlog.stdlib.BoundLogger,
    context_class=dict,
    logger_factory=structlog.stdlib.LoggerFactory(),
    cache_logger_on_first_use=True,
)
```

Basic Usage


```python
import structlog

log = structlog.get_logger()

log.info("user_logged_in", user_id=123, ip="192.168.1.1")
log.warning("rate_limit_exceeded", endpoint="/api/users", count=100)
log.error("database_error", error="connection timeout", retry=3)
```

Context Binding


```python
import structlog

log = structlog.get_logger()

# Bind context for all subsequent logs
log = log.bind(request_id="abc-123", user_id=42)
log.info("processing_started")  # Includes request_id and user_id
log.info("step_completed", step=1)
log.info("processing_finished")

# New context: new() clears previously bound values first
log = log.new(request_id="xyz-789")
```

FastAPI Integration


```python
import uuid

from fastapi import FastAPI, Request
import structlog

app = FastAPI()

@app.middleware("http")
async def add_request_context(request: Request, call_next):
    structlog.contextvars.clear_contextvars()
    structlog.contextvars.bind_contextvars(
        request_id=request.headers.get("X-Request-ID", str(uuid.uuid4())),
        path=request.url.path,
    )
    return await call_next(request)
```

Django Integration


```python
# settings.py
import structlog

LOGGING = {
    "version": 1,
    "disable_existing_loggers": False,
    "formatters": {
        "json": {
            "()": structlog.stdlib.ProcessorFormatter,
            "processor": structlog.processors.JSONRenderer(),
        },
    },
    "handlers": {
        "console": {
            "class": "logging.StreamHandler",
            "formatter": "json",
        },
    },
    "root": {
        "handlers": ["console"],
        "level": "INFO",
    },
}
```

Exception Logging


```python
try:
    risky_operation()
except Exception:
    log.exception("operation_failed", operation="risky")
    # Automatically includes stack trace
```

When NOT to Use This Skill


  • Simple scripts: the standard `logging` module is sufficient for basic needs
  • Legacy codebases: for small projects, the migration effort may not be worth it
  • Text-only log requirements: structlog as configured here emits JSON, which downstream tools must parse
  • Non-Python projects: use a logging framework appropriate to the language
  • Applications without centralized logging: standard logging may be simpler

Anti-Patterns

| Anti-Pattern | Why It's Bad | Solution |
|---|---|---|
| Using ConsoleRenderer in production | Wastes CPU, not machine-parseable | Use JSONRenderer for production |
| Not clearing context variables | Leaks context across requests | Use `structlog.contextvars.clear_contextvars()` |
| Logging large objects | Serialization overhead | Log only necessary fields or IDs |
| Creating a new logger per request | Performance overhead | Use `logger.bind()` to add context |
| Missing exception logging | Loses stack traces | Use `logger.exception()` in except blocks |
| Not configuring processors | Incomplete/inconsistent output | Configure the full processor pipeline |

Quick Troubleshooting

| Issue | Cause | Solution |
|---|---|---|
| Plain-text output instead of JSON | ConsoleRenderer configured | Switch to `JSONRenderer()` in processors |
| Context not appearing in logs | Not using context binding | Use `logger.bind()` or `contextvars` |
| Performance issues | Too many processors | Remove unnecessary processors; use JSONRenderer |
| Missing timestamps | No TimeStamper processor | Add `TimeStamper(fmt="iso")` to processors |
| Logs not colorized in dev | Missing dev configuration | Use `ConsoleRenderer(colors=True)` for development |
| Context bleeding across requests | Not clearing contextvars | Clear context at request start with middleware |