Send Python logs to Timber

Timber integrates with Python through the timber Python library, enabling you to send Python logs to your Timber account.



Installation

  1. Install the timber library:

    pip install timber
  2. Install the Timber handler, replacing YOUR_API_KEY and YOUR_SOURCE_ID accordingly:

    import logging
    import timber

    logger = logging.getLogger(__name__)
    logger.setLevel(logging.INFO)  # Set to logging.DEBUG if you want all logs

    timber_handler = timber.TimberHandler(source_id='YOUR_SOURCE_ID', api_key='YOUR_API_KEY')
    logger.addHandler(timber_handler)


Configuration

The TimberHandler accepts a variety of parameters that allow fine-grained control over its behavior.


level

Like any other logging.Handler, the TimberHandler can be configured to respond only to log events at or above a specific level:

# Only respond to events at least as important as `warning`
timber_handler = timber.TimberHandler(api_key='...', level=logging.WARNING)

You can also set the level on the logger itself, which applies to all of its handlers:
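For example, using the standard logging API (this is stock logging behavior, not Timber-specific):

```python
import logging

logger = logging.getLogger(__name__)
# Drop anything below WARNING for every handler attached to this logger
logger.setLevel(logging.WARNING)
```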


buffer_capacity and flush_interval

Timber buffers log events and sends them in the background for maximum performance. Buffered events are flushed when the buffer is full or when a certain amount of time has elapsed since the last flush. To control the size of the buffer, pass the buffer_capacity argument:

# Never allow more than 50 outstanding log events
timber_handler = timber.TimberHandler(api_key='...', buffer_capacity=50)

To control the maximum amount of time between buffer flushes, pass the flush_interval argument:

# Send any outstanding log events at most every 60 seconds
timber_handler = timber.TimberHandler(api_key='...', flush_interval=60)
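The flush-on-capacity-or-interval policy described above is a common buffering pattern. A minimal sketch of the idea (illustrative only, not Timber's actual implementation; the FlushBuffer name is made up):

```python
import time

class FlushBuffer:
    """Illustrative sketch of a capacity-or-interval flush policy."""
    def __init__(self, capacity, flush_interval, sink):
        self.capacity = capacity          # max outstanding events
        self.flush_interval = flush_interval  # max seconds between flushes
        self.sink = sink                  # callable that ships a batch of events
        self.events = []
        self.last_flush = time.monotonic()

    def emit(self, event):
        self.events.append(event)
        if len(self.events) >= self.capacity:
            self.flush()

    def tick(self):
        # Called periodically from a background thread
        if self.events and time.monotonic() - self.last_flush >= self.flush_interval:
            self.flush()

    def flush(self):
        self.sink(self.events)
        self.events = []
        self.last_flush = time.monotonic()
```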


raise_exceptions

Logging should never break your application, which is why the TimberHandler suppresses all internal exceptions by default. To change this behavior:

# Allow exceptions from internal log handling to propagate to the application,
# instead of suppressing them.
timber_handler = timber.TimberHandler(api_key='...', raise_exceptions=True)


drop_extra_events

As soon as the internal log event buffer is full, Timber flushes all of the events to the server; while that occurs, any incoming log events are dropped by default. To make your application block in this case instead, ensuring that every log statement is sent to Timber:

# Make log statements block until the internal log event buffer is no longer full.
timber_handler = timber.TimberHandler(api_key='...', drop_extra_events=False)


context

By default, all TimberHandler instances share the same context object (timber.context). If you'd like to use multiple loggers and multiple handlers, each with a different context, you can explicitly create and pass your own:

import logging
import timber

logger = logging.getLogger(__name__)

context = timber.TimberContext()
timber_handler = timber.TimberHandler(api_key='...', context=context)
logger.addHandler(timber_handler)

with context(job={'id': 123}):
    logger.critical('Background job execution started')
    # ... code here
    logger.critical('Background job execution completed')


Basic Logging

Use the Python logger as usual:'Info message')

Structured Logging

If you haven't already, please read our structured logging best practices guide.

logger.debug('Order #1234 placed, total: $520.23', extra={
    'order_placed': {
        'id': 1234,
        'total': 520.23
    }
})

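The extra dict above rides on the standard logging machinery: its keys become attributes on the emitted LogRecord, which is how a handler such as Timber's can pick up the structured payload. A quick standalone check using a throwaway capture handler (CaptureHandler is illustrative, not part of the timber library):

```python
import logging

records = []

class CaptureHandler(logging.Handler):
    """Minimal handler that stores records so we can inspect them."""
    def emit(self, record):
        records.append(record)

logger = logging.getLogger('extra-demo')
logger.setLevel(logging.DEBUG)
logger.addHandler(CaptureHandler())

logger.debug('Order #1234 placed, total: $520.23', extra={
    'order_placed': {'id': 1234, 'total': 520.23}
})
# The structured payload is now an attribute on the captured record
```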
Setting Context

Add shared structured data across multiple log statements:

with timber.context(job={'id': 123}):'Background job execution started')
    # ... code here'Background job execution completed')

Contexts nest and merge naturally:

with timber.context(job={'id': 123, 'count': 1}):
    # Sends a context {'job': {'id': 123, 'count': 1}}'Background job execution started')
    # ... code here
    with timber.context(job={'count': 2}):
        # Sends a context {'job': {'id': 123, 'count': 2}}'Background job in progress')
        # ... code here
    # Sends a context {'job': {'id': 123, 'count': 1}}'Background job execution completed')
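The nesting shown above behaves like a per-key merge of nested dicts, with the inner context winning on conflicts. A sketch of that merge semantics (merge_context is a hypothetical helper, not part of the timber API):

```python
def merge_context(outer, inner):
    """Merge an inner context into an outer one, combining nested dicts
    key by key so inner values win on conflicts (illustrative sketch)."""
    merged = dict(outer)
    for key, value in inner.items():
        if isinstance(value, dict) and isinstance(merged.get(key), dict):
            # Both sides are dicts: merge key by key, inner values win
            merged[key] = {**merged[key], **value}
        else:
            merged[key] = value
    return merged
```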


Performance

Extreme care was taken in the design of timber to make it fast and reliable:

  1. timber works directly with the Python logging library, inheriting the stability and performance benefits that this provides.

  2. Log data is buffered and flushed on an interval to optimize performance and delivery.

  3. The timber HTTP backend uses a controlled multi-buffer design to efficiently ship data to the Timber service.

  4. Connections are re-used and rotated to ensure efficient delivery of log data.

  5. Delivery failures are retried with an exponential backoff, maximizing successful delivery.

  6. Msgpack is used for payload encoding for its superior performance and memory management.

  7. The Timber service ingest endpoint is a highly available (HA) service designed to handle extreme fluctuations in volume; it responds in under 50ms to reduce back pressure.
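Point 5's retry strategy can be sketched as follows (illustrative only; send_with_backoff is a made-up name, not the library's internal API):

```python
import time
import random

def send_with_backoff(send, payload, max_retries=5, base_delay=0.5):
    """Retry a delivery callable with exponential backoff and jitter.
    Illustrative sketch of the technique, not Timber's implementation."""
    for attempt in range(max_retries):
        try:
            return send(payload)
        except Exception:
            if attempt == max_retries - 1:
                raise  # out of retries: surface the failure
            # Double the delay each attempt, plus jitter to spread retries out
            time.sleep(base_delay * (2 ** attempt) + random.uniform(0, base_delay))
```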


Troubleshooting

To begin, please see our log delivery troubleshooting guide, which covers the most common issues we see with log delivery.

If the above troubleshooting guide does not resolve your issue, we recommend enabling the raise_exceptions option. This should surface any delivery errors.