Asynchronous Data Fetching from Random URLs
Code introduction


This code defines an asynchronous function `fetch_random_url` that fetches data from a randomly selected URL. A second asynchronous function, `fetch_all_urls`, creates multiple tasks from it and fetches five random URLs concurrently with `asyncio.gather`. Finally, the synchronous wrapper `get_random_data` uses `asyncio.run` to execute `fetch_all_urls` and return the results.


Technology Stack : Python, aiohttp, asyncio

Code Type : Asynchronous HTTP request

Code Difficulty : Intermediate


import random
import aiohttp
import asyncio

async def fetch_random_url(session):
    # Pick one of the 100 sample todo items from the placeholder API
    url = "http://jsonplaceholder.typicode.com/todos/{}".format(random.randint(1, 100))
    async with session.get(url) as response:
        return await response.json()

async def fetch_all_urls():
    async with aiohttp.ClientSession() as session:
        tasks = [fetch_random_url(session) for _ in range(5)]
        return await asyncio.gather(*tasks)

def get_random_data():
    return asyncio.run(fetch_all_urls())
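
The fan-out pattern above can be tried without any network access. The sketch below is a minimal, standard-library-only stand-in: `fetch_fake` (a hypothetical helper, not part of the original code) replaces the `session.get` call with a short `asyncio.sleep`, while the task creation and `asyncio.gather` structure stay the same.

```python
import asyncio
import random

async def fetch_fake(task_id):
    # Simulate a network round-trip with a short random delay
    # (stands in for the session.get call in the real code).
    await asyncio.sleep(random.uniform(0.01, 0.05))
    return {"id": task_id, "completed": False}

async def fetch_all_fake(n=5):
    # Schedule all coroutines at once; gather returns results
    # in the order the tasks were passed in, not completion order.
    tasks = [fetch_fake(i) for i in range(n)]
    return await asyncio.gather(*tasks)

results = asyncio.run(fetch_all_fake())
print([r["id"] for r in results])  # [0, 1, 2, 3, 4]
```

Even though each simulated fetch finishes at a random time, `asyncio.gather` preserves input order, which is why the real `fetch_all_urls` can rely on a stable result list.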