Resolving Celery "Received unregistered task of type" error

Python

23 Apr 2020 | 4 minute read

When implementing Celery with FastAPI, I ran into a confusing issue: Celery just wouldn't register my tasks. In this article, I will go in-depth on why this happened and how I solved it. Hopefully, it can help you solve the error as well.

Did you remember to import the module containing this task?

When starting Celery, it would report the error Received unregistered task of type and ask me whether I had imported the module containing the task.

As I was 100% sure that I had done so, I was confused. Why wasn't Celery recognizing my imported modules?

celery-worker_1  | [2020-04-23 11:48:59,696: ERROR/MainProcess] Received unregistered task of type 'app.tasks.capture_payment_create_order'.
celery-worker_1  | The message has been ignored and discarded.
celery-worker_1  |
celery-worker_1  | Did you remember to import the module containing this task?
celery-worker_1  | Or maybe you're using relative imports?
celery-worker_1  |
celery-worker_1  | Please see
celery-worker_1  | http://docs.celeryq.org/en/latest/internals/protocol.html
celery-worker_1  | for more information.
celery-worker_1  |
celery-worker_1  | The full contents of the message body was:
celery-worker_1  | b'[[1, 1, 1], {}, {"callbacks": null, "errbacks": null, "chain": null, "chord": null}]' (84b)
celery-worker_1  | Traceback (most recent call last):
celery-worker_1  |   File "/usr/local/lib/python3.7/site-packages/celery/worker/consumer/consumer.py", line 562, in on_task_received
celery-worker_1  |     strategy = strategies[type_]
celery-worker_1  | KeyError: 'app.tasks.capture_payment_create_order'
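
For reference, the task that Celery refused to run lives in app/tasks.py. A minimal sketch of how such a task is wired up (the import path, parameter names and body below are placeholders, not my actual payment code):

# app/tasks.py
from app.celery import celery  # import path assumed; point it at wherever the Celery instance lives


@celery.task
def capture_payment_create_order(payment_id, order_id, user_id):
    # Placeholder body -- the real task captures a payment and creates an order.
    ...

The task is queued from the API with capture_payment_create_order.delay(...), which is where the [[1, 1, 1], {}, ...] message body in the log above comes from.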

I enabled debug logging for Celery to get more insight into what was going on.

$ celery worker -A app.celery --loglevel=DEBUG
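
Another quick way to see which tasks a running worker actually knows about is Celery's inspect command; assuming the same -A app.celery app path, that would be:

$ celery -A app.celery inspect registered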

While exploring the logs, I confirmed that my application's tasks were not recognized by Celery.

celery-worker_1  | [tasks]
celery-worker_1  |   . celery.accumulate
celery-worker_1  |   . celery.backend_cleanup
celery-worker_1  |   . celery.chain
celery-worker_1  |   . celery.chord
celery-worker_1  |   . celery.chord_unlock
celery-worker_1  |   . celery.chunks
celery-worker_1  |   . celery.group
celery-worker_1  |   . celery.map
celery-worker_1  |   . celery.starmap

I was quite confused, as I had never experienced this issue before. Celery was configured to include all of the application's tasks, so that shouldn't have been the issue, as you can see in the following snippet.

from celery import Celery

# SETTINGS comes from the application's own configuration module
celery = Celery(
    "gringotts", broker=SETTINGS.CELERY_BROKER, include=["app.tasks"]
)
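
As a quick sanity check for this kind of setup, the task registry can also be inspected from a Python shell. A hedged sketch, assuming the instance above is importable and that importing app.tasks succeeds:

>>> from app.celery import celery   # wherever the instance above is defined
>>> import app.tasks                 # importing the module registers its tasks
>>> sorted(celery.tasks.keys())      # app.tasks.* should appear next to the celery.* built-ins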

I tried various ways of including the tasks, restarted the workers and the server, and even downgraded Celery.

Still scratching my head after trying various solutions to the problem, I decided to stop and remove all Docker containers to reset my environment.

$ docker stop $(docker ps -a -q)
$ docker rm $(docker ps -a -q)
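
In hindsight, rebuilding the worker image is worth doing at the same time, since a stale image can keep serving old code. With a docker-compose setup that would be something along the lines of:

$ docker-compose up --build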

Restarting the application...

celery-worker_1  | [tasks]
celery-worker_1  |   . app.tasks.capture_payment_create_order
celery-worker_1  |   . celery.accumulate
celery-worker_1  |   . celery.backend_cleanup
celery-worker_1  |   . celery.chain
celery-worker_1  |   . celery.chord
celery-worker_1  |   . celery.chord_unlock
celery-worker_1  |   . celery.chunks
celery-worker_1  |   . celery.group
celery-worker_1  |   . celery.map
celery-worker_1  |   . celery.starmap

And voilà, the tasks were now recognized. Most likely the worker container had been running stale code from before the task was added, and recreating the containers forced it to pick up the current version.

I was once again reminded to try the simple things first. 🙂