Running Concurrently with asyncio.gather() | Tasks and Scheduling
Python Asyncio in Depth

Running Concurrently with asyncio.gather()

asyncio.create_task() gives you fine-grained control over individual tasks. When you just want to run a group of coroutines concurrently and collect their results, asyncio.gather() is the cleaner tool.

Basic Usage

asyncio.gather() takes any number of coroutines or tasks, runs them concurrently, and returns a list of their results in the same order as the inputs – regardless of which one finished first.

```python
import asyncio

import httpx
import nest_asyncio

nest_asyncio.apply()

# Fetching multiple posts concurrently
async def fetch_post(client, post_id):
    url = f"https://jsonplaceholder.typicode.com/posts/{post_id}"
    response = await client.get(url)
    data = response.json()
    return data["title"]

async def main():
    async with httpx.AsyncClient() as client:
        titles = await asyncio.gather(
            fetch_post(client, 1),
            fetch_post(client, 2),
            fetch_post(client, 3),
        )
        for index, title in enumerate(titles, start=1):
            print(f"Post {index}: {title}")

asyncio.run(main())
```

The results list preserves input order – titles[0] is always the result of fetch_post(client, 1), even if post 3 finished first.
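To see the ordering guarantee without making network requests, here is a minimal sketch that uses asyncio.sleep to simulate I/O. The worker listed first is given the longest delay, so it finishes last, yet it still occupies the first slot in the results:

```python
import asyncio

async def worker(name, delay):
    # Simulate I/O; a longer delay means a later finish
    await asyncio.sleep(delay)
    return name

async def main():
    # "slow" is listed first but finishes last
    return await asyncio.gather(
        worker("slow", 0.3),
        worker("medium", 0.2),
        worker("fast", 0.1),
    )

results = asyncio.run(main())
print(results)  # ['slow', 'medium', 'fast'] – input order, not finish order
```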

Unpacking a List of Coroutines

When the number of tasks is dynamic, use the * unpacking operator:

```python
import asyncio

import httpx
import nest_asyncio

nest_asyncio.apply()

# Fetching a variable number of posts
async def fetch_post(client, post_id):
    url = f"https://jsonplaceholder.typicode.com/posts/{post_id}"
    response = await client.get(url)
    return response.json()["title"]

async def main():
    post_ids = [1, 2, 3, 4, 5]
    async with httpx.AsyncClient() as client:
        coroutines = [fetch_post(client, post_id) for post_id in post_ids]
        titles = await asyncio.gather(*coroutines)
        for title in titles:
            print(title)

asyncio.run(main())
```

Handling Exceptions with return_exceptions

By default (return_exceptions=False), the first exception raised by any coroutine propagates immediately to the caller awaiting gather(); the remaining awaitables are not cancelled and continue to run in the background. Setting return_exceptions=True changes this behavior – exceptions are collected into the results list alongside successful values, so you can inspect each outcome individually.

```python
import asyncio

import httpx
import nest_asyncio

nest_asyncio.apply()

# Fetching posts – one with an invalid ID to trigger an error
async def fetch_post(client, post_id):
    url = f"https://jsonplaceholder.typicode.com/posts/{post_id}"
    response = await client.get(url)
    data = response.json()
    if not data:
        raise ValueError(f"No data for post {post_id}")
    return data["title"]

async def main():
    async with httpx.AsyncClient() as client:
        results = await asyncio.gather(
            fetch_post(client, 1),
            fetch_post(client, 99999),  # Returns empty – raises ValueError
            fetch_post(client, 3),
            return_exceptions=True,
        )
        for index, result in enumerate(results, start=1):
            if isinstance(result, Exception):
                print(f"Post {index} failed: {result}")
            else:
                print(f"Post {index}: {result}")

asyncio.run(main())
```
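For contrast, here is a minimal sketch of the default behavior (no return_exceptions), again using asyncio.sleep in place of real network calls. The first exception propagates straight out of the await on gather(), where an ordinary try/except can catch it:

```python
import asyncio

async def ok(delay):
    await asyncio.sleep(delay)
    return "ok"

async def boom():
    await asyncio.sleep(0.1)
    raise ValueError("boom")

async def main():
    try:
        # Default: the first exception is re-raised here.
        # The ok() coroutine is not cancelled and keeps running.
        await asyncio.gather(ok(0.5), boom())
    except ValueError as exc:
        return f"caught: {exc}"

result = asyncio.run(main())
print(result)  # caught: boom
```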

gather() vs create_task()

Use asyncio.create_task() when you need a handle to each task individually – for example, to cancel one, check whether it is done, or await different tasks at different points in your code. Use asyncio.gather() when you simply want to run a batch of coroutines concurrently and collect all their results, in input order, with a single await.
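The difference can be sketched side by side. This illustrative example uses asyncio.sleep in place of real work; the task names and delays are arbitrary:

```python
import asyncio

async def job(name, delay):
    await asyncio.sleep(delay)
    return name

async def main():
    # create_task(): each task gets its own handle,
    # which you can await, inspect, or cancel separately
    t1 = asyncio.create_task(job("a", 0.1))
    t2 = asyncio.create_task(job("b", 0.2))
    first = await t1
    second = await t2

    # gather(): one await for the whole batch,
    # results come back in input order
    batch = await asyncio.gather(job("c", 0.2), job("d", 0.1))
    return first, second, batch

first, second, batch = asyncio.run(main())
print(first, second, batch)  # a b ['c', 'd']
```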


Section 2. Chapter 2
