Django-Q2: Asynchronous Task Management Made Easy
A guide to setting up and using Django-Q2 for asynchronous task management in Django projects.
By James Lau, Indie App Developer
Django-Q2 is a powerful, yet easy-to-use, asynchronous task queue for Django. It allows you to offload time-consuming tasks to background processes, improving your web application’s responsiveness and overall performance. This guide walks you through setting up and using Django-Q2 in your Django projects.
Installation
First, install the django-q2 package using pip:
pip install django-q2
Next, add django_q to your INSTALLED_APPS in your project’s settings.py:
INSTALLED_APPS = (
    # other apps
    'django_q',
)
Run Django migrations to create the necessary database tables:
python manage.py migrate
Configuration
Configure Django-Q2 by adding a Q_CLUSTER dictionary to your settings.py. This dictionary defines the settings for the task queue cluster. Here’s a sample configuration:
Q_CLUSTER = {
    'name': 'myproject',
    'workers': 1,
    'recycle': 500,
    'timeout': 60,
    'compress': True,
    'cpu_affinity': 1,
    'save_limit': 250,
    'queue_limit': 500,
    'label': 'Django Q',
    'redis': {  # Config same as redis.StrictRedis
        'host': '192.168.0.207',
        'port': 6379,
        'db': 0,
        'password': '_34dff[l]',
    }
}
Explanation of Configuration Options:
- name: The name of your cluster.
- workers: The number of worker processes to start. A good starting point is the number of CPU cores you have.
- recycle: The maximum number of tasks a worker will process before being recycled. This helps prevent memory leaks.
- timeout: The maximum time (in seconds) a task is allowed to run before being killed.
- compress: Whether to compress task data.
- cpu_affinity: Assigns a worker to a specific CPU core (1 means the first core).
- save_limit: The maximum number of successful tasks whose results are stored.
- queue_limit: The maximum number of tasks allowed in the queue.
- label: A human-readable label for the cluster.
- redis: The Redis connection settings. Django-Q2 uses Redis as its message broker.
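Redis is only one of the supported brokers. If you would rather not run a separate Redis server, Django-Q2 can also use the Django ORM (your default database) as its broker. A minimal sketch of such a configuration, with illustrative values:

```python
# settings.py -- alternative Q_CLUSTER configuration using the
# Django ORM as the broker instead of Redis. The 'orm' key names
# the database alias to use ('default' here).
Q_CLUSTER = {
    'name': 'myproject',
    'workers': 4,
    'timeout': 60,
    'retry': 120,       # how long before an unacknowledged task is retried
    'orm': 'default',   # use the default database as the message broker
}
```

The ORM broker is convenient for development and small deployments; a dedicated broker such as Redis generally scales better under load.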
Demo: Running a Simple Task
Create hooks.py in the same directory as manage.py:
def print_result(task):
    print(f"The result is: {task.result}")
Create tasks.py in the same directory as manage.py:
def add(a: int, b: int) -> int:
    return a + b
Create playground.py in the same directory as manage.py:
if __name__ == '__main__':
    import os
    os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'myapp.settings')

    import django
    django.setup()

    import tasks
    from django_q.tasks import async_task

    async_task(tasks.add, 2, 3, hook='hooks.print_result')
Explanation:
- hooks.py: Contains a hook function that will be executed after the task completes. In this case, it prints the task's result.
- tasks.py: Contains the actual task function, add, which simply adds two numbers.
- playground.py: A script that sets up the Django environment and then uses async_task to enqueue the add task. The hook argument specifies the hook function to be called when the task finishes.
Run the playground.py script to create a task:
python playground.py
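Besides firing a hook, async_task returns a task id that you can use to fetch the result yourself. A minimal sketch, assuming the same Django environment set up in playground.py (the 2000 ms wait value is an illustrative choice):

```python
# Enqueue the task and poll for its result by id. result() returns
# None until a worker has finished the task; the second argument
# blocks for up to that many milliseconds while waiting.
from django_q.tasks import async_task, result

import tasks

task_id = async_task(tasks.add, 2, 3)
value = result(task_id, 2000)
if value is None:
    print("Task not finished yet -- is qcluster running?")
else:
    print(f"The result is: {value}")
```

Note that nothing is executed until a qcluster worker picks the task up, so run this only with the worker process started.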
Running the Worker
Start the Django-Q2 worker process using the qcluster management command:
python manage.py qcluster
This command starts the worker process, which will listen for tasks in the queue and execute them.
Periodic Tasks
Django-Q2 supports scheduled tasks. You can create a scheduled task using the Schedule model:
from django_q.models import Schedule
Schedule.objects.create(
func="jobs.task_management.scrape_jobs.scrape_jobs",
hook='hooks.print_result',
schedule_type=Schedule.DAILY
)
This example creates a daily scheduled task that runs the scrape_jobs function and calls the print_result hook after it completes.
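DAILY is only one of the available schedule types; Schedule also supports values such as ONCE, MINUTES, HOURLY, WEEKLY, MONTHLY, and CRON. A sketch of a repeating schedule that runs the add task from tasks.py every 30 minutes (the interval and repeat count are illustrative choices):

```python
# A repeating schedule: run tasks.add every 30 minutes, forever
# (repeats=-1 means the schedule never expires).
from django_q.models import Schedule

Schedule.objects.create(
    func='tasks.add',
    args='2,3',                      # positional arguments, as a string
    schedule_type=Schedule.MINUTES,
    minutes=30,
    repeats=-1,
)
```

Scheduled tasks are dispatched by the qcluster process, so the worker must be running for them to fire.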
Monitoring
To monitor the Django-Q2 cluster, you can use the qmonitor management command. First, install the blessed package:
pip install blessed
Then, run the monitor:
python manage.py qmonitor
This will provide a real-time view of the queue, workers, and task status.
Conclusion
Django-Q2 simplifies asynchronous task management in Django. By following these steps, you can easily integrate it into your projects and improve their performance by offloading time-consuming tasks to background processes.