Implementing Asynchronous Tasks with Celery
This challenge focuses on building a robust and scalable application by leveraging Celery for asynchronous task execution. You will learn how to define, trigger, and monitor background tasks, enabling your application to handle time-consuming operations without blocking the main execution flow. This is crucial for improving user experience and application responsiveness.
Problem Description
Your task is to implement a simple task processing system using Celery. You will create a Flask application that can trigger Celery tasks to perform operations like sending emails and processing data in the background. The goal is to demonstrate the core concepts of Celery, including task definition, task invocation, and basic result retrieval.
Key Requirements:
- Celery Setup: Configure Celery with a message broker (Redis is recommended) and a result backend; a minimal sketch of this setup and the task definitions follows this list.
- Task Definition: Define at least two distinct Celery tasks:
- A task that simulates sending an email (e.g., prints a message to the console).
- A task that performs a simple calculation, like squaring a number.
- Flask Integration: Create a minimal Flask application with two endpoints:
- An endpoint to trigger the "send email" task.
- An endpoint to trigger the "square number" task and retrieve its result.
- Task Invocation: Within the Flask endpoints, invoke the Celery tasks asynchronously.
- Result Retrieval: For the "square number" task, ensure you can retrieve its result from the result backend.
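As a point of reference, a minimal sketch of such a tasks.py module might look like the following; the module name, task names, and the local Redis URLs are assumptions for illustration, not part of the requirements.

# tasks.py -- illustrative sketch; names and Redis URLs are assumptions
from celery import Celery

# The broker queues incoming tasks; the backend stores their results.
celery_app = Celery(
    "tasks",
    broker="redis://localhost:6379/0",
    backend="redis://localhost:6379/0",
)

@celery_app.task
def send_email(recipient, subject):
    # Simulate sending an email by printing to the worker's console.
    print(f"Sending email to {recipient} with subject: {subject}")

@celery_app.task
def square_number(number):
    # The return value is written to the result backend for later retrieval.
    return number * number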
Expected Behavior:
- When the "send email" endpoint is hit, the task should be queued and executed by a Celery worker without blocking the Flask request. The Flask application should return a confirmation message immediately.
- When the "square number" endpoint is hit with a number, the task should be queued, and the Flask application should return a task ID. A subsequent request (or logic within the application) should be able to retrieve the squared result using this task ID.
- Celery workers should be able to pick up and execute these tasks.
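One way the Flask side might be wired up is sketched below; it assumes the hypothetical tasks.py module above, a local Redis instance, and the route names used in the examples that follow.

# app.py -- illustrative sketch, assuming the tasks.py module sketched above
from flask import Flask, request
from tasks import celery_app, send_email, square_number

app = Flask(__name__)

@app.route("/send-email", methods=["POST"])
def trigger_send_email():
    # .delay() hands the task to the broker and returns immediately.
    result = send_email.delay("user@example.com", "Welcome!")
    return f"Email task queued with ID: {result.id}"

@app.route("/square-number", methods=["POST"])
def trigger_square_number():
    number = request.get_json()["number"]
    result = square_number.delay(number)
    return f"Square task queued with ID: {result.id}"

@app.route("/task-result/<task_id>")
def task_result(task_id):
    # AsyncResult looks the task up in the result backend by its ID.
    result = celery_app.AsyncResult(task_id)
    if result.ready():
        return f"The result of task {task_id} is: {result.result}"
    return f"Task {task_id} is still pending."

if __name__ == "__main__":
    app.run(debug=True)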
Edge Cases:
- Consider what happens if the Celery worker is not running. The tasks should still be queued by the broker.
- Consider how to handle potential errors within a Celery task (explicit error handling is optional for this challenge, but understanding the concept is key; a retry sketch follows this list).
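If you do explore error handling, a retrying task could be sketched as follows; the task name and retry settings are illustrative only.

# Illustrative retry sketch; the task and its failure mode are hypothetical
from tasks import celery_app

@celery_app.task(bind=True, max_retries=3, default_retry_delay=5)
def send_email_with_retry(self, recipient, subject):
    try:
        # Imagine a flaky SMTP call here; this sketch just prints instead.
        print(f"Sending email to {recipient} with subject: {subject}")
    except Exception as exc:
        # Requeue the task; Celery gives up after max_retries attempts.
        raise self.retry(exc=exc)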
Examples
Example 1: Triggering the "Send Email" Task
Request:
POST /send-email
Flask Application Output (immediate):
Email task queued with ID: <task_id>
Celery Worker Output (after task is picked up):
[INFO/MainProcess] Task tasks.send_email[<task_id>] received
Sending email to user@example.com with subject: Welcome!
[INFO/MainProcess] Task tasks.send_email[<task_id>] succeeded in X.XXs: None
Explanation: The Flask application receives a request, creates a Celery task for sending an email, and immediately returns a confirmation message including the task ID. The Celery worker picks up this task and prints a simulated email sending message.
Example 2: Triggering the "Square Number" Task and Retrieving Result
Request 1 (to trigger task):
POST /square-number
Body: {"number": 5}
Flask Application Output (immediate):
Square task queued with ID: <task_id_square>
Celery Worker Output (after task is picked up):
[INFO/MainProcess] Task tasks.square_number[<task_id_square>] received
[INFO/MainProcess] Task tasks.square_number[<task_id_square>] succeeded in X.XXs: 25
Request 2 (to retrieve result, assuming task ID is <task_id_square>):
GET /task-result/<task_id_square>
Flask Application Output:
The result of task <task_id_square> is: 25
Explanation: The Flask application receives a request to square a number. It queues a Celery task with the number 5. It returns the task ID. The Celery worker executes the task, calculates 5*5=25, and stores the result. A subsequent GET request uses the task ID to fetch the result from the backend.
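For reference, the same retrieval flow can be exercised directly in Python with the hypothetical tasks.py module sketched earlier; the task ID and result shown in the comments are illustrative.

# Illustrative: reading a stored result by task ID
from tasks import celery_app, square_number

async_result = square_number.delay(5)
print(async_result.id)               # the <task_id_square> string from the example
print(async_result.get(timeout=10))  # blocks until the worker has stored 25

# Later, or in another process, the ID alone is enough to rebuild a handle:
rebuilt = celery_app.AsyncResult(async_result.id)
print(rebuilt.result)                # 25 once the task has succeeded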
Constraints
- Python Version: Python 3.7+
- Dependencies: You must use celery, flask, and redis (for broker and backend).
- Message Broker: Use Redis for the Celery message broker.
- Result Backend: Use Redis for the Celery result backend.
- Task Execution: Tasks must be executed asynchronously by a separate Celery worker process.
- Endpoint Logic: Flask endpoints should only be responsible for queuing tasks and retrieving results, not performing the heavy lifting.
Notes
- Celery Configuration: Pay close attention to how you configure Celery with your broker and backend URLs.
- Task Definition Location: It's common practice to define Celery tasks in a separate module (e.g., tasks.py).
- Worker Startup: You will need to run the Celery worker process separately from your Flask application. The command typically looks like celery -A your_app worker -l info.
- Result Backend: For result retrieval, you'll use the AsyncResult object provided by Celery.
- Error Handling (Optional but Recommended): While not strictly required for basic functionality, explore how Celery handles task failures and how you might implement retry mechanisms or error logging.