Module 05 / Project 02 — Concurrent Requests¶
Learn Your Way¶
| Read | Build | Watch | Test | Review | Visualize | Try |
|---|---|---|---|---|---|---|
| — | This project | — | — | Flashcards | — | — |
Focus¶
- `aiohttp` for async HTTP requests
- `aiohttp.ClientSession` for connection reuse
- `asyncio.gather()` to fetch multiple URLs at once
- Comparing sync vs async request times
Why this project exists¶
The real power of async shows up when you need to fetch data from many sources. Instead of waiting for each response before making the next request, you fire them all at once. This project makes that difference concrete.
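The gap can be sketched with the standard library alone, using `asyncio.sleep` as a stand-in for network latency (a simulation only; the project itself uses `aiohttp` for real HTTP):

```python
import asyncio
import time

async def fake_fetch(post_id: int) -> str:
    # Stand-in for an HTTP request: each "request" takes 0.2 s.
    await asyncio.sleep(0.2)
    return f"post {post_id}"

async def sequential(n: int) -> float:
    start = time.perf_counter()
    for i in range(n):
        await fake_fetch(i)  # waits for each response before starting the next
    return time.perf_counter() - start

async def concurrent(n: int) -> float:
    start = time.perf_counter()
    await asyncio.gather(*(fake_fetch(i) for i in range(n)))  # all in flight at once
    return time.perf_counter() - start

seq_time = asyncio.run(sequential(10))    # ~10 x 0.2 s = ~2 s
gat_time = asyncio.run(concurrent(10))    # ~0.2 s total
print(f"sequential: {seq_time:.1f}s, gathered: {gat_time:.1f}s")
```

The ratio mirrors the sync vs async totals in the expected output below: sequential time grows linearly with the number of requests, while the gathered time stays close to the slowest single request.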
Run¶
```shell
cd projects/modules/05-async-python/02-concurrent-requests
pip install -r ../requirements.txt
python project.py
```
Expected output¶
```text
--- Sync requests (one at a time) ---
Fetched post 1: "sunt aut facere..." (270 chars) in 0.3s
Fetched post 2: "qui est esse..." (230 chars) in 0.2s
...
Sync total: ~2.5 seconds for 10 requests

--- Async requests (all at once) ---
Fetched post 1: "sunt aut facere..." (270 chars) in 0.3s
Fetched post 3: "ea molestias..." (250 chars) in 0.3s
...
Async total: ~0.4 seconds for 10 requests
```
Alter it¶
- Increase from 10 to 50 URLs. How do the sync vs async times compare now?
- Add a semaphore (`asyncio.Semaphore(5)`) to limit concurrent requests to 5 at a time.
- Fetch from two different API endpoints in the same `gather()` call.
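A minimal sketch of the semaphore variant, again simulating the HTTP round trip with `asyncio.sleep`; the `active`/`peak` counters are not part of the project and exist only to make the concurrency limit observable:

```python
import asyncio

peak = 0    # highest number of tasks ever inside the semaphore at once
active = 0  # tasks currently inside the semaphore

async def limited_fetch(sem: asyncio.Semaphore, i: int) -> int:
    global peak, active
    async with sem:                # waits while 5 other tasks already hold a slot
        active += 1
        peak = max(peak, active)
        await asyncio.sleep(0.05)  # stand-in for the HTTP round trip
        active -= 1
    return i

async def main() -> int:
    sem = asyncio.Semaphore(5)     # at most 5 "requests" in flight at a time
    await asyncio.gather(*(limited_fetch(sem, i) for i in range(20)))
    return peak

print("peak concurrency:", asyncio.run(main()))  # never exceeds 5
```

Creating the semaphore inside the running coroutine (rather than at module level) keeps it bound to the event loop that actually uses it.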
Break it¶
- Remove `async with session:` and just use `session = aiohttp.ClientSession()`. What happens?
- Fetch a URL that does not exist. How does error handling differ from sync?
- Set a very short timeout (0.001 seconds). What exception do you get?
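To see the timeout exception without touching the network, you can wrap a slow coroutine in `asyncio.wait_for()` with the same tiny timeout (`slow_fetch` here is a stand-in; aiohttp's own client timeout errors also derive from `asyncio.TimeoutError`):

```python
import asyncio

async def slow_fetch() -> str:
    await asyncio.sleep(1)  # pretend the server takes a full second to respond
    return "body"

async def main() -> str:
    try:
        # 0.001 s is far shorter than the 1 s "response time", so this fails.
        return await asyncio.wait_for(slow_fetch(), timeout=0.001)
    except asyncio.TimeoutError:
        return "timed out"

print(asyncio.run(main()))  # timed out
```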
Fix it¶
- Add `try`/`except` around each fetch so one failure does not crash the whole batch.
- Add `asyncio.wait_for()` with a timeout per request.
- Close the session properly in a `finally` block if you removed the `async with`.
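One way to sketch the per-request guard, with a hypothetical `flaky_fetch` standing in for an aiohttp call that sometimes fails: each task catches its own exception and returns a placeholder, so `gather()` still yields one result per request.

```python
import asyncio

async def flaky_fetch(i: int) -> str:
    await asyncio.sleep(0.01)
    if i == 3:
        raise ValueError(f"request {i} failed")  # simulated bad response
    return f"post {i}"

async def safe_fetch(i: int) -> str:
    # try/except per task: one failure no longer crashes the whole batch,
    # and wait_for() bounds how long any single request may take.
    try:
        return await asyncio.wait_for(flaky_fetch(i), timeout=1.0)
    except (ValueError, asyncio.TimeoutError) as exc:
        return f"error: {exc}"

async def main() -> list[str]:
    return await asyncio.gather(*(safe_fetch(i) for i in range(5)))

results = asyncio.run(main())
print(results)
```

An alternative is `asyncio.gather(..., return_exceptions=True)`, which returns raised exceptions as values in the result list instead of propagating the first one.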
Explain it¶
- Why is `aiohttp` needed instead of just using `requests` with `asyncio`?
- What does `ClientSession` do that individual requests do not?
- How does a semaphore help when making hundreds of requests?
- What happens if you `await` each request individually instead of using `gather()`?
Mastery check¶
You can move on when you can:

- use `aiohttp` to fetch multiple URLs concurrently,
- explain why `requests` (sync) cannot be used with `asyncio` effectively,
- add error handling per-request in a batch,
- use a semaphore to limit concurrency.