Supercharging Your Django App with Celery Asynchronous Tasks

Harun Tanrıverdi
3 min read · Jul 29, 2023


Celery is an asynchronous task queue system written in Python, designed to distribute time-consuming tasks across multiple workers. It’s perfect for work that can run in the background, such as sending emails, processing large datasets, or handling periodic jobs like generating reports. Celery follows a distributed architecture and integrates with various message brokers such as Redis, RabbitMQ, or Amazon SQS (Simple Queue Service).

For further information, you can visit the official Celery website.

Why Do We Need Celery in Django?

Django follows a synchronous request-response cycle: when a user makes a request to the server, the server processes the request and generates a response. However, certain tasks, like heavy computations, file processing, or external API calls, can introduce delays and slow down the server’s response time. Here’s where Celery comes into play. By offloading these time-consuming tasks to background workers, the Django server can continue serving user requests without waiting for them to complete, which improves the user experience and enhances the scalability and responsiveness of your Django app.

Let’s get to the installation without further ado.

1. Setting up Celery:

Install Celery

After you have created a virtual environment using Anaconda or venv, install Celery:

pip install celery
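Since we will use Redis as the message broker later in this example, you will also need the Redis client library. Celery ships an extra that pulls it in for you:

pip install "celery[redis]"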

Configure Celery:

Good news: Celery has officially supported Django integration since version 3.1. Here is the recommended way to wire it into a Django project.

First, create a new file called celery.py inside your Django project package, next to settings.py (i.e. mainapp/mainapp/celery.py).

import os
from celery import Celery

# Make sure Django settings are loaded before the Celery app is configured.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'mainapp.settings')

app = Celery('mainapp')

# Read every setting prefixed with CELERY_ from Django's settings module.
app.config_from_object('django.conf:settings', namespace='CELERY')

# Look for a tasks.py module in every installed Django app.
app.autodiscover_tasks()

This configuration automatically registers every task defined in your Django apps. Next, you need to import the Celery app in mainapp/mainapp/__init__.py so that it is loaded whenever Django starts:

from .celery import app as celery_app

__all__ = ('celery_app',)
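For reference, assuming the project is named mainapp and the app is named main (the names used throughout this post), the layout at this point looks like this:

mainapp/
├── manage.py
├── main/            # Django app (models.py, tasks.py)
└── mainapp/         # project package
    ├── __init__.py
    ├── settings.py
    └── celery.py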

Lastly, let’s define some configuration in settings.py. Celery uses a message broker to handle communication between Django and the Celery workers. Popular message brokers include RabbitMQ and Redis. For this example, we’ll use Redis. You can install Redis and start its server by following the official documentation: https://redis.io/docs/getting-started/installation/

import os  # make sure this import exists at the top of settings.py

CELERY_TIMEZONE = 'Europe/Istanbul'
CELERY_TASK_TRACK_STARTED = True
CELERY_TASK_TIME_LIMIT = 30 * 60  # hard limit of 30 minutes per task
CELERY_BROKER_URL = os.environ.get("CELERY_BROKER", "redis://redis:6379/0")
# Reuse the same Redis instance for storing task results.
CELERY_RESULT_BACKEND = os.environ.get("CELERY_BROKER", "redis://redis:6379/0")
CELERY_BEAT_SCHEDULE = {
    'create_task_main': {
        'task': 'main.tasks.create_task_main',
        'schedule': 5.0,  # run every five seconds
    },
}
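A fixed interval in seconds is enough for this demo. If you later need cron-style timing, Celery’s crontab schedule can be dropped into the same place; a minimal sketch, reusing the task path from above:

from celery.schedules import crontab

CELERY_BEAT_SCHEDULE = {
    'create_task_main': {
        'task': 'main.tasks.create_task_main',
        'schedule': crontab(minute=0, hour=0),  # run once a day at midnight
    },
}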

As a use case, we will build our own job scheduler that reads data from an endpoint every five seconds and stores the required part in the database.

Now it’s time to create a Celery task. A task is a Python function that performs some asynchronous operation. Create a new file called tasks.py inside your Django app folder (here, main/tasks.py).

from django.utils.timezone import make_aware
from celery import shared_task
from main.models import Transactions
import logging
import requests
import datetime

logger = logging.getLogger(__name__)

@shared_task
def create_task_main():
    try:
        response = requests.get("https://hub.dummyapis.com/delay?seconds=2")
        response.raise_for_status()  # turn 4xx/5xx responses into HTTPError
        data = response.json()
        Transactions.objects.create(
            id=data["id"],
            name=data["name"],
            timestamp=make_aware(datetime.datetime.fromtimestamp(float(data["timestamp"]))),
        )
        return dict(success=True)

    except requests.HTTPError as e:
        logger.error('e.code: %s', str(e.response.status_code))
        return dict(success=False, error_code=str(e.response.status_code))
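The Transactions model itself is not shown here; a minimal sketch matching the fields the task writes, with field types as assumptions you should adjust to your data:

from django.db import models

class Transactions(models.Model):
    # Field names mirror what create_task_main reads from the API response.
    id = models.CharField(primary_key=True, max_length=64)
    name = models.CharField(max_length=255)
    timestamp = models.DateTimeField()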

A dummy API endpoint is used in this use case to model a real-life scenario. When we start the Django and Celery services, the application will run the create_task_main task, requesting the endpoint every 5 seconds in the background. You can adjust the schedule as you need.
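Besides the beat schedule, you can trigger the same task on demand, for example from a Django view, using .delay():

from main.tasks import create_task_main

result = create_task_main.delay()  # enqueue the task and return immediately
print(result.id)  # AsyncResult id, useful for checking the task state later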

Running the Celery Worker

To start processing Celery tasks, you need to run the Celery worker. Open a terminal, navigate to your Django project directory, and run the following command:

celery -A mainapp worker -l info
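Note that the worker alone does not fire the CELERY_BEAT_SCHEDULE entries; the beat scheduler has to run as well, either as its own process or embedded in the worker with -B (the latter is recommended for development only):

celery -A mainapp beat -l info

# or, for development, both in a single process:
celery -A mainapp worker -B -l info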

In this guide, we’ve explored the powerful combination of Django and Celery, and how it can revolutionize the way you develop and deploy your web applications. By leveraging Celery’s asynchronous task processing capabilities, you can supercharge your Django app and provide a seamless user experience while efficiently handling time-consuming operations in the background.
