Integrating Celery with Django Using Redis as the Message Broker
Celery is an asynchronous task queue/job queue based on distributed message passing. It is focused on real-time operation but supports scheduling as well. The execution units, called tasks, are executed concurrently on one or more worker servers using multiprocessing, Eventlet, or gevent. Tasks can execute asynchronously (in the background) or synchronously (wait until ready).
In this post, we'll walk through integrating Celery with Django using Redis as the message broker. We'll cover the setup and provide example code to get you started.
Prerequisites
Before we begin, make sure you have the following installed:
- Python 3.x
- Django
- Redis
You can install Redis on your system or use a Docker container.
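If you'd rather not install Redis system-wide, one common approach (assuming Docker is installed; the container name `local-redis` is just an example) is:

```shell
# Run Redis in the background, exposing the default port 6379.
docker run -d --name local-redis -p 6379:6379 redis:7

# Quick sanity check -- should print PONG.
docker exec local-redis redis-cli ping
```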
Step 1: Install Celery and Redis
First, install Celery together with its Redis dependencies using pip:

```
pip install "celery[redis]"
```

Quoting the package spec keeps shells such as zsh from interpreting the square brackets as a glob pattern.
Step 2: Configure Django Settings
Add the Celery configuration to your Django project's settings. Open `settings.py` and add the following:

```python
# settings.py

# Celery settings
CELERY_BROKER_URL = 'redis://localhost:6379/0'      # Redis as the message broker
CELERY_RESULT_BACKEND = 'redis://localhost:6379/0'  # Redis as the result backend
CELERY_ACCEPT_CONTENT = ['json']
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
CELERY_TIMEZONE = 'UTC'
```
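Because the serializers are set to `json`, everything you pass to a task (and everything it returns) must survive a JSON round trip. A quick standard-library sketch of that constraint:

```python
import json
from datetime import datetime

# Plain numbers, strings, lists, and dicts survive the JSON round trip
# that Celery performs on every task message.
assert json.loads(json.dumps([4, 4])) == [4, 4]

# Rich objects such as datetime do not serialize; pass ISO strings instead.
try:
    json.dumps({"when": datetime(2024, 1, 1)})
except TypeError:
    pass  # expected: datetime is not JSON-serializable

payload = {"when": datetime(2024, 1, 1).isoformat()}
assert json.loads(json.dumps(payload)) == {"when": "2024-01-01T00:00:00"}
```

In practice this means sending primary keys or ISO-formatted timestamps to tasks rather than model instances or datetime objects.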
Step 3: Create a Celery Instance
Create a new file named `celery.py` in your Django project directory (next to `settings.py`), and configure the Celery instance:
```python
# celery.py
from __future__ import absolute_import, unicode_literals

import os

from celery import Celery

# Set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'your_project_name.settings')

# Create the Celery app instance.
app = Celery('your_project_name')

# Using a string here means the worker doesn't have to serialize
# the configuration object to child processes.
# - namespace='CELERY' means all celery-related configuration keys
#   should have a `CELERY_` prefix.
app.config_from_object('django.conf:settings', namespace='CELERY')

# Load task modules from all registered Django app configs.
app.autodiscover_tasks()


@app.task(bind=True)
def debug_task(self):
    print(f'Request: {self.request!r}')
```
Replace `your_project_name` with the name of your Django project.
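To make the `namespace='CELERY'` line concrete: Celery only reads settings that start with `CELERY_`, strips that prefix, and maps the remainder to its own lowercase option names. A toy sketch of the idea (illustration only, not Celery's actual implementation):

```python
# Toy model of namespace='CELERY' (hypothetical helper, not Celery's code):
# keep settings with the CELERY_ prefix, strip it, and lowercase the rest.
def strip_namespace(settings, namespace="CELERY"):
    prefix = namespace + "_"
    return {
        key[len(prefix):].lower(): value
        for key, value in settings.items()
        if key.startswith(prefix)
    }

django_settings = {
    "CELERY_BROKER_URL": "redis://localhost:6379/0",
    "CELERY_TASK_SERIALIZER": "json",
    "DEBUG": True,  # ignored: no CELERY_ prefix
}

print(strip_namespace(django_settings))
# {'broker_url': 'redis://localhost:6379/0', 'task_serializer': 'json'}
```

This is why the keys in `settings.py` carry the `CELERY_` prefix: it keeps Celery's options clearly separated from the rest of your Django configuration.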
Step 4: Create Tasks
In your Django app, create a file named `tasks.py` and define some tasks:
```python
# tasks.py
from celery import shared_task


@shared_task
def add(x, y):
    return x + y


@shared_task
def mul(x, y):
    return x * y


@shared_task
def xsum(numbers):
    return sum(numbers)
```
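One benefit of keeping task bodies this simple: a `@shared_task`-decorated function can still be called directly, in which case it runs synchronously in-process with no broker or worker involved, which makes unit testing straightforward. Stand-in definitions with the same bodies illustrate the call semantics:

```python
# Stand-ins with the same bodies as the tasks above; a @shared_task
# function called directly (without .delay) behaves just like these.
def add(x, y):
    return x + y

def mul(x, y):
    return x * y

def xsum(numbers):
    return sum(numbers)

# Direct, synchronous calls -- handy in unit tests.
assert add(4, 4) == 8
assert mul(3, 5) == 15
assert xsum([1, 2, 3]) == 6
```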
Step 5: Update `__init__.py`
Ensure the Celery app is loaded when Django starts. Open `__init__.py` in your Django project directory and add the following:
```python
# __init__.py
from __future__ import absolute_import, unicode_literals

# This will make sure the app is always imported when
# Django starts so that shared_task will use this app.
from .celery import app as celery_app

__all__ = ('celery_app',)
```
Step 6: Run Celery Worker
Start a Celery worker from your project root (the directory containing `manage.py`):

```
celery -A your_project_name worker --loglevel=info
```
Replace `your_project_name` with the name of your Django project.
Step 7: Using Celery Tasks
You can now use Celery tasks in your Django views or any other part of your project. For example:
```python
# views.py
from django.http import HttpResponse

from .tasks import add


def index(request):
    # .delay() enqueues the task and returns an AsyncResult immediately.
    result = add.delay(4, 4)
    return HttpResponse(f'Task ID: {result.id}, Task Status: {result.status}')
```

Note that `.delay()` does not wait for the task to finish: the view returns immediately, so the status it reports will usually be `PENDING`. To inspect the outcome later, look the task up by its ID using the result backend.
Conclusion
You've successfully integrated Celery with Django using Redis as the message broker. Celery can significantly improve the performance and scalability of your Django applications by offloading long-running tasks to the background. This setup can be expanded with more advanced configurations and tasks tailored to your specific needs. Happy coding!