How to Do Asynchronous Tasks in Django?


Summary: In this tutorial, you will get an overview of performing asynchronous tasks using Celery in Django.

Asynchronous Tasks Using Celery in Django

In Django, you can use a task queue such as Celery to handle asynchronous tasks. Here’s a general overview of the process:

  1. Install and configure Celery in your Django project: You can install Celery using pip, add it to the INSTALLED_APPS list in your project’s settings.py file, and configure the message broker (e.g., Redis or RabbitMQ) that Celery will use to send and receive messages.
  2. Create tasks as Python functions, decorated with Celery’s @shared_task decorator: Once Celery is set up, you can create tasks by defining Python functions and decorating them with the @shared_task decorator. This tells Celery that the function should be run as a task, and makes it available for you to use in your views or other code.
  3. Use the apply_async() method to add tasks to the queue: To add a task to the queue, call the apply_async() method on the task function, passing any necessary arguments to the task as the args list (and keyword arguments as kwargs).
  4. Start a Celery worker to process the tasks in the queue: Once you have added tasks to the queue, you need to start a worker process that will listen for new tasks and execute them. You can start a worker by running the celery worker command from the command line.

Let’s see a basic example of using Celery in Django:

1. Installing and configuring Celery

Start by installing Celery using pip:

pip install celery
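
If you plan to use Redis as the broker, as in this example, the Redis client library is needed as well; one common way to install it is through the celery[redis] extra:

pip install "celery[redis]"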

Then, in your Django project’s settings.py file, add 'celery' to the INSTALLED_APPS list and configure the message broker.

Here is an example of how to configure Celery to use Redis as the broker:

# settings.py
INSTALLED_APPS = [
    ...
    'celery',
    ...
]

# URL of the Redis instance that Celery uses as the message broker
CELERY_BROKER_URL = 'redis://localhost:6379/0'
# Optional: where Celery stores task results and status
CELERY_RESULT_BACKEND = 'redis://localhost:6379/0'
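
Celery reads these CELERY_-prefixed settings through a Celery application object, which is conventionally defined in a celery.py file next to settings.py. Below is a minimal sketch of that file, following the standard Celery/Django integration pattern and assuming the project package is named myproject (matching the worker command used later):

# myproject/celery.py
import os

from celery import Celery

# Make sure Django settings are loaded before the Celery app is configured
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'myproject.settings')

app = Celery('myproject')

# Read all CELERY_* settings from Django's settings.py
app.config_from_object('django.conf:settings', namespace='CELERY')

# Auto-discover tasks.py modules in all installed apps
app.autodiscover_tasks()

The same pattern usually also imports this app in myproject/__init__.py (from .celery import app as celery_app) so that it is loaded when Django starts.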

2. Creating a task

Once you are done configuring Celery in settings.py, create an asynchronous task in any of your Django apps (conventionally in a tasks.py file, so that Celery can discover it).

For this, decorate the function with the shared_task decorator from the celery module. This marks the function as a Celery task so that it can be queued and executed asynchronously by a worker.

Here is an example of a simple asynchronous task that sends an email:

# myapp/tasks.py
from celery import shared_task
from django.core.mail import send_mail


@shared_task
def send_email(email):
    # This runs in a Celery worker process, not in the web request
    send_mail(
        'This is the subject of the mail',
        'My mail message goes here.',
        'admin@pencilprogrammer.com',
        [email],
        fail_silently=False,
    )

3. Adding a task to the queue

Once you are done creating the task, import it wherever it is needed (for example, in a view) and call the apply_async() method, passing the necessary arguments.

from myapp.tasks import send_email
send_email.apply_async(args=[email])

The apply_async() method sends the task to the queue so that it runs in the background, in parallel with the rest of your code. You can read more about it in the Celery documentation.

You can also use .delay(), which is a convenient shortcut for apply_async() that takes the task’s arguments directly but does not accept execution options such as countdown or eta:

from myapp.tasks import send_email
send_email.delay(email)
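
For instance, a hypothetical view (the view and field names here are just for illustration) could queue the email and respond immediately, without waiting for the mail to be sent:

# myapp/views.py (hypothetical example)
from django.http import JsonResponse

from myapp.tasks import send_email


def signup(request):
    email = request.POST.get('email')
    # Queue the task and return right away; a worker sends the mail later
    send_email.delay(email)
    return JsonResponse({'status': 'email queued'})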

4. Starting the worker

After this, you can start a Celery worker by running the following command from the command line:

celery -A myproject worker --loglevel=info

Here, myproject is the name of your Django project.

Please keep in mind that this is a very basic example, and you may need to add more complex configuration, error handling and other features depending on the requirements of your project.

What is a message broker here?

A message broker is software that acts as an intermediary between applications that need to send and receive messages.

In the context of Celery, the message broker is responsible for receiving tasks from the client application (your Django project) and storing them in a queue. It then sends these tasks to a worker process, which retrieves them from the queue and executes them.

In the above example, I’ve used Redis as the message broker, which is a popular choice for use with Celery, but there are other message brokers as well, such as RabbitMQ, Amazon SQS, and Kafka.
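
Switching brokers is mostly a matter of changing the broker URL in settings.py; for example, pointing Celery at a local RabbitMQ instance (assuming the default guest credentials) would look like this:

# settings.py
CELERY_BROKER_URL = 'amqp://guest:guest@localhost:5672//'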

The message broker is responsible for managing the queue and ensuring that tasks are delivered to the worker processes in the correct order.

This allows the client application to send tasks to the queue and continue executing, without having to wait for the task to complete. The worker processes can then retrieve tasks from the queue and execute them in parallel, improving the performance of your application.

Why Celery?

Celery is a widely-used task queue for Python, and it’s well-suited for use in Django projects because it can integrate easily with the Django project structure and database models.

One of the main benefits of using a task queue like Celery is that it allows you to run long-running tasks or tasks that need to be executed at a specific time asynchronously, without blocking the execution of other parts of your application.

This can improve the performance and responsiveness of your application by allowing it to handle multiple requests simultaneously.

Celery also provides a lot of other features out of the box, such as:

  • Retrying failed tasks
  • Scheduling tasks to run at a specific time
  • Monitoring task progress and status
  • Distributed task execution using multiple worker processes
  • Support for multiple message brokers such as Redis, RabbitMQ, and others.
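
For example, the retry and scheduling features listed above can be combined in a single task. A minimal sketch (the retry limits and delays below are arbitrary choices, not recommendations) might look like this:

# myapp/tasks.py (hypothetical sketch of retries)
from celery import shared_task
from django.core.mail import send_mail


@shared_task(bind=True, max_retries=3, default_retry_delay=60)
def send_email_with_retry(self, email):
    try:
        send_mail(
            'This is the subject of the mail',
            'My mail message goes here.',
            'admin@pencilprogrammer.com',
            [email],
            fail_silently=False,
        )
    except Exception as exc:
        # Re-queue the task; Celery gives up after max_retries attempts
        raise self.retry(exc=exc)

Calling send_email_with_retry.apply_async(args=[email], countdown=300) would then schedule the task to run five minutes later.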

So, Celery is a strong choice when you have to execute a task that takes a long time to complete, or when you want to schedule a task to run at a specific time.
