Async Iterators and Generators
Regular iterators produce values synchronously – each next() call returns immediately. Async iterators allow each value to involve I/O: fetching a page, reading a chunk, waiting for an event. The async for loop handles the awaiting automatically.
async for – Iterating Over Async Streams
Use async for to consume an async iterator inside an async def function:
```python
import asyncio

import httpx
import nest_asyncio

nest_asyncio.apply()

# Async generator that yields post titles one at a time
async def post_titles(client, post_ids):
    for post_id in post_ids:
        response = await client.get(
            f"https://jsonplaceholder.typicode.com/posts/{post_id}"
        )
        data = response.json()
        yield data["title"]  # Yielding after each network call

async def main():
    async with httpx.AsyncClient() as client:
        async for title in post_titles(client, [1, 2, 3, 4, 5]):
            print(title)

asyncio.run(main())
```
Each iteration awaits the next HTTP request before yielding. The caller doesn't manage the await – async for handles it.
Writing Async Generators
An async generator is an async def function that contains yield. It produces values lazily, with each value potentially involving async work:
```python
import asyncio

import httpx
import nest_asyncio

nest_asyncio.apply()

# Paginating through posts in batches
async def paginate_posts(client, total, batch_size):
    for start in range(1, total + 1, batch_size):
        end = min(start + batch_size, total + 1)
        batch_ids = list(range(start, end))
        responses = await asyncio.gather(
            *[
                client.get(f"https://jsonplaceholder.typicode.com/posts/{post_id}")
                for post_id in batch_ids
            ]
        )
        for response in responses:
            yield response.json()["title"]

async def main():
    async with httpx.AsyncClient() as client:
        async for title in paginate_posts(client, total=10, batch_size=3):
            print(title)

asyncio.run(main())
```
Writing a Full Async Iterator Class
For more control, implement __aiter__ and __anext__ directly:
```python
import asyncio

import httpx
import nest_asyncio

nest_asyncio.apply()

# Async iterator that fetches posts one at a time
class PostIterator:
    def __init__(self, client, post_ids):
        self.client = client
        self.post_ids = iter(post_ids)

    def __aiter__(self):
        return self

    async def __anext__(self):
        try:
            post_id = next(self.post_ids)
        except StopIteration:
            raise StopAsyncIteration  # Signals end of iteration
        response = await self.client.get(
            f"https://jsonplaceholder.typicode.com/posts/{post_id}"
        )
        return response.json()["title"]

async def main():
    async with httpx.AsyncClient() as client:
        async for title in PostIterator(client, [1, 2, 3]):
            print(title)

asyncio.run(main())
```
Async Generators vs Async Iterators
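In short: an async generator covers most cases with far less code, while a full iterator class makes state explicit and can expose extra methods or attributes. A minimal side-by-side sketch of the same stream written both ways (the `tick_gen` and `TickIter` names are illustrative, with `asyncio.sleep(0)` standing in for real I/O):

```python
import asyncio

# Async generator version: state lives in local variables
async def tick_gen(n):
    for i in range(n):
        await asyncio.sleep(0)
        yield i

# Class version: more verbose, but state is explicit attributes
class TickIter:
    def __init__(self, n):
        self.i = 0
        self.n = n

    def __aiter__(self):
        return self

    async def __anext__(self):
        if self.i >= self.n:
            raise StopAsyncIteration
        await asyncio.sleep(0)
        self.i += 1
        return self.i - 1

async def main():
    gen_values = [v async for v in tick_gen(3)]
    cls_values = [v async for v in TickIter(3)]
    return gen_values, cls_values

gen_values, cls_values = asyncio.run(main())
print(gen_values == cls_values, gen_values)  # True [0, 1, 2]
```

Both produce identical streams; reach for the class form only when you need the extra control.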