Distributed Python

Build your Celery (the distributed task queue, not the vegetable) skills with weekly tutorials,
how-tos and articles about common gotchas.

October 01, 2018

Estimated reading time: 4 minutes

Celery, docker and the missing startup banner

Posted by Bjoern Stiel

All source code examples used in this blog post can be found on GitHub:
https://github.com/ZoomerAnalytics/celery-docker-missing-banner

Have you ever dockerized your Celery app and wondered where the Celery worker banner in the docker log output went? Usually, when you start your Celery worker, it comes up with a startup screen that displays useful information such as version, broker, result backend, concurrency and subscribed queues.

 -------------- celery@cheetah v4.2.1 (windowlicker)
---- **** -----
--- * ***  * -- Linux-4.9.93-linuxkit-aufs-x86_64-with-debian-9.5 2018-09-29 15:21:28
-- * - **** ---
- ** ---------- [config]
- ** ---------- .> app:         __main__:0x22cf0c047b8
- ** ---------- .> transport:   redis://localhost:6379/0
- ** ---------- .> results:     redis://localhost:6379/0
- *** --- * --- .> concurrency: 4 (prefork)
-- ******* ---- .> task events: ON
--- ***** -----
 -------------- [queues]
                .> celery           exchange=celery(direct) key=celery


[tasks]
  . tasks.task

[2018-09-29 16:30:34,998: INFO/MainProcess] Connected to redis://localhost:6379/0
Connected to redis://localhost:6379/0
[2018-09-29 16:30:36,136: INFO/MainProcess] mingle: searching for neighbors
mingle: searching for neighbors

However, when you dockerize your Celery worker (celery worker --app=worker.app --loglevel=INFO), start it up with docker-compose up -d and fetch the logs with docker logs ..., something unexpected happens - the worker startup banner is missing (clone the GitHub example repo to reproduce it).

[2018-09-30 09:22:51,849: INFO/MainProcess] Connected to redis://redis:6379/0
[2018-09-30 09:22:51,872: INFO/MainProcess] mingle: searching for neighbors
[2018-09-30 09:22:52,904: INFO/MainProcess] mingle: all alone
[2018-09-30 09:22:52,917: INFO/MainProcess] worker@f6d948a65d3f ready.
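For reference, the compose setup looks roughly like the following - a sketch, not the exact file from the example repo (image choices and service names are assumptions, though the environment variable names match the app config shown further down):

```yaml
version: '3'

services:
  redis:
    image: redis:alpine

  worker:
    build: .
    command: celery worker --app=worker.app --loglevel=INFO
    environment:
      - CELERY_BROKER_URL=redis://redis:6379/0
      - CELERY_RESULT_BACKEND=redis://redis:6379/0
    depends_on:
      - redis
```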

It gets even more mysterious if you bash into your running worker container to start up another Celery worker manually.

$ docker-compose exec worker bash
root@8ec76438799a:/app# celery worker --app=worker.app --loglevel=INFO

[2018-10-01 09:06:24,990: INFO/MainProcess] Customize Celery logger, default handler: <StreamHandler <stderr> (NOTSET)>

 -------------- celery@8ec76438799a v4.2.0 (windowlicker)
---- **** -----
--- * ***  * -- Linux-4.9.93-linuxkit-aufs-x86_64-with-debian-9.5 2018-10-01 09:06:25
-- * - **** ---
- ** ---------- [config]
- ** ---------- .> app:         __main__:0x7f4ef055f358
- ** ---------- .> transport:   redis://redis:6379/0
- ** ---------- .> results:     redis://redis:6379/0
- *** --- * --- .> concurrency: 2 (prefork)
-- ******* ---- .> task events: OFF (enable -E to monitor tasks in this worker)
--- ***** -----
 -------------- [queues]
                .> celery           exchange=celery(direct) key=celery


[tasks]
  . tasks.task

[2018-10-01 09:06:25,095: INFO/MainProcess] Connected to redis://redis:6379/0
[2018-10-01 09:06:25,105: INFO/MainProcess] mingle: searching for neighbors

Yay, there it is! At least in the interactive session, that is - unfortunately, there is still no sign of this new worker's banner in the docker logs (docker-compose logs worker). For some reason, docker does not capture this output. What on earth is going on here? Spoiler: even though I will show you a solution that makes the banner show up in the logs, I still haven't fully understood the root cause. If you fully understand what's going on here, please come forward and comment below!

So, in order to close in on this mystery, let's have a look at the Celery source code. When the worker starts up, it is print(..., file=sys.__stdout__) that is responsible for printing the banner to the console - to /dev/stdout, which redirects to /proc/self/fd/1, to be more precise. All of which should be fine according to the docker docs:
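To see why the file=sys.__stdout__ part matters, here is a minimal sketch of the difference between sys.stdout (which can be re-pointed at runtime) and sys.__stdout__ (which always refers to the original stream the interpreter started with):

```python
import io
import sys

# Swap sys.stdout for an in-memory buffer, as logging/capturing setups do.
buffer = io.StringIO()
real_stdout, sys.stdout = sys.stdout, buffer

print("via sys.stdout")                            # lands in the buffer
print("via sys.__stdout__", file=sys.__stdout__)   # bypasses the swap entirely

sys.stdout = real_stdout
captured = buffer.getvalue()  # contains only the first message
```

So whatever happens to sys.stdout later on, the banner is written to the stream the process originally started with.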

By default, docker logs or docker service logs shows the command’s output just as it would appear if you ran the command interactively in a terminal.

This doesn’t seem to be the case here. Or does it… and it’s not a docker but a Python logging issue?
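One data point worth collecting here is whether the container's stdout is attached to a TTY: under docker-compose up -d it is not, and CPython block-buffers (rather than line-buffers) stdout in that case, so output written directly to it can linger in the buffer. A minimal probe (PYTHONUNBUFFERED is the standard CPython environment variable that disables this buffering):

```python
import os
import sys

# In a detached container, stdout is typically not a TTY, so CPython
# block-buffers it instead of flushing on every newline.
print("stdout is a tty:", sys.stdout.isatty())
print("PYTHONUNBUFFERED set:", bool(os.environ.get("PYTHONUNBUFFERED")))
sys.stdout.flush()  # force out anything still sitting in the buffer
```

Whether this is the whole story here, I can't say for certain - but it is an easy thing to rule in or out.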

Let’s go back to the Celery side of things and examine the default logger by connecting to the after_setup_logger signal (for more details on Celery logging, have a look at this blog post: 3 Strategies to Customise Celery logging handlers):

import os
import logging
from celery import Celery
from celery.signals import after_setup_logger


app = Celery()
app.conf.update({
    'broker_url': os.environ['CELERY_BROKER_URL'],
    'imports': ('tasks', ),
    'task_serializer': 'json',
    'result_serializer': 'json',
    'accept_content': ['application/json'],
    'result_backend': os.environ['CELERY_RESULT_BACKEND']
})


@after_setup_logger.connect
def setup_loggers(logger, *args, **kwargs):
    logger.info(f'Customize Celery logger, default handler: {logger.handlers[0]}')

This produces an extra line which tells us that the default handler is a StreamHandler which writes to sys.stderr.

[2018-09-30 09:32:00,130: INFO/MainProcess] Customize Celery logger, default handler: <StreamHandler <stderr> (NOTSET)>
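That the default handler targets stderr is not Celery-specific - it is plain Python logging behaviour, as a quick sketch confirms:

```python
import logging
import sys

# With no stream argument, logging.StreamHandler attaches to sys.stderr.
handler = logging.StreamHandler()
is_stderr = handler.stream is sys.stderr  # → True
```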

Remember how the worker uses print(file=sys.__stdout__) to print the banner? Even though print and logger seem totally unrelated to me, adding a StreamHandler(sys.stdout) handler to the Celery logger (i.e. explicitly streaming to stdout) makes the startup banner appear in the docker logs:

import sys

@after_setup_logger.connect
def setup_loggers(logger, *args, **kwargs):
    logger.addHandler(logging.StreamHandler(sys.stdout))

So we have a solution but no good answer. Is it Docker or is it Python? Are you able to find the missing jigsaw puzzle piece?

All source code examples used in this blog post can be found on GitHub:
https://github.com/ZoomerAnalytics/celery-docker-missing-banner

If there's an aspect that I didn't cover, or you have specific questions (or find something wrong), let me know in the comments - I'll answer them as best as I can.



Get distributed Python tips straight to your inbox