I’m using Python’s asyncio module to run concurrent network requests.
import asyncio
import aiohttp

async def fetch(url, max_pages):
    async with aiohttp.ClientSession() as session:
        tasks = [get_html(session, f"{url}/{i}/") for i in range(1, max_pages + 1)]
        await asyncio.gather(*tasks)  # this function exits before all responses arrive

loop = asyncio.get_event_loop()
loop.run_until_complete(fetch(url, max_pages))
What is wrong with my code? I expect it to finish all the tasks.
>Solution:
It’s hard to say for certain without seeing get_html, but two things stand out. First, loop.run_until_complete does block until fetch finishes, so if gather appears to return early, the likely culprit is get_html itself, e.g. sending the request without awaiting the response body. Second, asyncio.get_event_loop is deprecated for this use in recent Python versions; prefer the modern entry point:

asyncio.run(fetch(url, max_pages))

asyncio.run creates a fresh event loop, runs the coroutine to completion, and cleanly closes the loop afterwards.
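To illustrate the structure, here is a minimal runnable sketch of the same pattern. Since your get_html isn’t shown, it is stubbed out with asyncio.sleep (the real version would use session.get from aiohttp), so this runs without any network access:

```python
import asyncio

async def get_html(session, page_url):
    # Stub standing in for the real aiohttp request, e.g.:
    #   async with session.get(page_url) as resp:
    #       return await resp.text()
    await asyncio.sleep(0.01)  # simulate network latency
    return f"<html>{page_url}</html>"

async def fetch(url, max_pages):
    # In the real code, session would be an aiohttp.ClientSession
    session = None
    tasks = [get_html(session, f"{url}/{i}/") for i in range(1, max_pages + 1)]
    # gather() waits for every task and returns their results in order;
    # returning them lets the caller confirm all pages were fetched
    return await asyncio.gather(*tasks)

results = asyncio.run(fetch("https://example.com", 3))
print(len(results))  # 3 pages fetched
```

Note that fetch here returns the gathered results; if your real fetch discards them, the tasks still complete before run_until_complete (or asyncio.run) returns, which is why the problem most likely lies inside get_html.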