Integrating Celery with Django: A Practical Guide


A step-by-step guide to integrating Celery with Django for asynchronous task processing, including configuration and troubleshooting.

By James Lau, Indie App Developer

Setting up Celery with Django: A Step-by-Step Guide

This guide walks you through integrating Celery, a powerful asynchronous task queue, with your Django project. Celery is useful for handling tasks outside of the request-response cycle, such as sending emails, processing large datasets, or any other time-consuming operation. This prevents your web application from becoming unresponsive.

1. Project Setup

Make sure you have a Django project set up. If not, create one:

django-admin startproject ai_server
cd ai_server
python manage.py startapp myapp # Or any app name you prefer
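
If Celery itself isn't installed yet, install it together with the Redis client library (Redis is used as the broker later in this guide):

pip install celery redis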

2. Celery Configuration

a) ai_server/__init__.py

Add the following to your ai_server/__init__.py file. This ensures that the Celery app is loaded when Django starts:

# This will make sure the app is always imported when
# Django starts so that shared_task will use this app.
from .celery import app as celery_app

__all__ = ('celery_app',)

b) ai_server/celery.py

Create a celery.py file in your project directory (ai_server/):

import os
from celery import Celery
from django.conf import settings

# Set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'ai_server.settings')

app = Celery('ai_server', broker=settings.CELERY_BROKER_URL)  # Use your project's name here.

# Configure Celery using settings from Django settings.py.
app.config_from_object('django.conf:settings', namespace='CELERY')

# Load tasks from all registered Django app configs.
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)

This file configures Celery, sets the Django settings module, and automatically discovers tasks in your Django apps.

c) ai_server/settings.py

Add the following Celery-related settings to your ai_server/settings.py:

import os  # needed for the os.environ lookups below

# REDIS_PASSWORD and REDIS_URL (the Redis host) come from environment variables.
CELERY_BROKER_URL = f'redis://:{os.environ.get("REDIS_PASSWORD")}@{os.environ.get("REDIS_URL")}:6379/0'
# Celery Configuration Options
CELERY_TIMEZONE = "Asia/Shanghai"
CELERY_TASK_TRACK_STARTED = True
CELERY_TASK_TIME_LIMIT = 30 * 60

Important: Make sure you have Redis installed and running. The CELERY_BROKER_URL specifies the connection details for Redis, which Celery uses as a message broker. Use environment variables to store sensitive information like passwords.
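
Before starting any workers, it can help to confirm the broker is actually reachable. Here is a minimal sketch using the redis-py client (installed earlier via pip install redis) and the same environment variables as above:

import os
import redis

broker_url = f'redis://:{os.environ.get("REDIS_PASSWORD")}@{os.environ.get("REDIS_URL")}:6379/0'
print(redis.Redis.from_url(broker_url).ping())  # True if Redis is reachable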

3. Define Celery Tasks

Create a tasks.py file in one of your Django apps (e.g., myapp/tasks.py):

from celery import shared_task

@shared_task
def my_task(arg1, arg2):
    # Task logic here
    result = arg1 + arg2
    return result

The @shared_task decorator makes the function a Celery task. Celery automatically discovers these tasks because of app.autodiscover_tasks() in celery.py.
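
Real-world tasks usually need error handling. As a sketch only, here is a hypothetical send_welcome_email task that retries on failure; bind=True exposes the task instance as self so it can call self.retry():

from celery import shared_task
from django.core.mail import send_mail

@shared_task(bind=True, max_retries=3, default_retry_delay=60)
def send_welcome_email(self, user_email):
    try:
        send_mail(
            subject="Welcome!",
            message="Thanks for signing up.",
            from_email="noreply@example.com",  # hypothetical sender address
            recipient_list=[user_email],
        )
    except Exception as exc:
        # Retry up to 3 times, waiting 60 seconds between attempts.
        raise self.retry(exc=exc)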

4. Triggering Tasks

Here’s an example of how to trigger a Celery task:

Create a playground.py file (can be placed anywhere for testing purposes):

if __name__ == '__main__':
    import os
    os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'ai_server.settings')
    import django
    django.setup()

    # Import the task only after django.setup(), in case it touches models.
    from myapp.tasks import my_task
    result = my_task.delay(3, 5)
    print(result)  # Prints the AsyncResult (the task's ID)

Explanation:

  • delay() is the standard way to call a Celery task asynchronously. It adds the task to the Celery queue and returns an AsyncResult; call result.get(timeout=...) if you need to block for the task's return value.
  • delay_on_commit() (available in Celery 5.4+) queues the task only after the current Django database transaction has been committed. If the transaction rolls back, the task is never sent. Note that it returns None rather than an AsyncResult. See the sketch below.
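
A minimal sketch of delay_on_commit() inside a view; create_record and the surrounding model logic are hypothetical:

from django.db import transaction
from django.http import JsonResponse

from myapp.tasks import my_task

def create_record(request):
    with transaction.atomic():
        # ... create or update model instances here ...
        # The task is queued only after this transaction commits;
        # on rollback it is never sent.
        my_task.delay_on_commit(3, 5)  # returns None, unlike delay()
    return JsonResponse({"status": "queued"})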

5. Running Celery Worker

Open a new terminal and start the Celery worker:

For Windows:

First, install eventlet:

pip install eventlet

Then, run the Celery worker:

celery -A ai_server worker --loglevel=info -P eventlet

The -P eventlet option matters on Windows: Celery's default prefork pool isn't supported there, so an alternative pool such as eventlet is required for concurrency.

For Linux/macOS:

celery -A ai_server worker --loglevel=info

  • -A ai_server: Specifies the Celery app instance (your project's name).
  • worker: Starts the Celery worker.
  • --loglevel=info: Sets the logging level to INFO.

Troubleshooting Tips

  • Redis Connection: Ensure Redis is running and accessible at the URL specified in CELERY_BROKER_URL.
  • Task Discovery: Double-check that INSTALLED_APPS in settings.py includes the apps containing your tasks.py files. Running celery -A ai_server inspect registered against a live worker lists the tasks it has discovered.
  • Celery Beat: For scheduled tasks, you'll need to configure Celery Beat. A full setup is beyond the scope of this guide and is well-documented in the Celery documentation, but a minimal schedule looks like the sketch below.
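
For reference, a minimal Beat schedule in settings.py might look like the following sketch; the entry name and timing are hypothetical, and a separate beat process (celery -A ai_server beat) must run alongside the worker:

from celery.schedules import crontab

CELERY_BEAT_SCHEDULE = {
    'add-every-morning': {
        'task': 'myapp.tasks.my_task',          # dotted path to the task
        'schedule': crontab(hour=8, minute=0),  # every day at 08:00
        'args': (3, 5),
    },
}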

This guide provides a foundation for using Celery with Django. As your project grows, explore more advanced Celery features like task routing, custom serializers, and monitoring tools.
