How to Set Up Proxies with AIOHTTP in Python
When building web scraping tools or asynchronous HTTP clients in Python, handling IP restrictions and access blocks is a crucial challenge. Using proxies can significantly improve your request reliability and help you avoid getting blocked. In this guide, we'll explore how to configure proxies with the popular asynchronous HTTP library AIOHTTP. We'll use DataImpulse as our proxy provider, a cost-effective solution offering residential proxies with flexible authentication and IP rotation features.
Why Use Proxies with AIOHTTP?
When you make repeated HTTP requests, especially for scraping, websites often block IPs that flood their servers with traffic. Proxies act as intermediaries, masking your IP address and making your traffic harder to block. AIOHTTP is well suited for this because it combines asynchronous networking with per-request proxy configuration, making your scraping fast and scalable.
Getting Started: Prerequisites
Before jumping in, ensure you have:
- Python 3.8+ installed on your system (recent aiohttp releases no longer support older versions).
- An active proxy plan from DataImpulse with your username and password ready.
Installing Required Libraries
Install aiohttp if you haven't yet (asyncio ships with Python's standard library, so it doesn't need to be installed separately):
pip install aiohttp
Alternatively, on the Windows command prompt:
python -m pip install aiohttp
Basic HTTP Request with AIOHTTP
Let's start with a simple example that sends a request to ip-api.com, a free service that reports the requester's IP address (its free endpoint is plain HTTP, at http://ip-api.com/json):

import aiohttp
import asyncio

async def get_response():
    async with aiohttp.ClientSession() as session:
        async with session.get('http://ip-api.com/json') as response:
            print('Status Code:', response.status)
            print('Body:', await response.text())

asyncio.run(get_response())
Run this script to see your IP and the response status. This confirms that AIOHTTP works for basic requests.
Integrating Proxies with AIOHTTP
To add proxy support, you’ll need your proxy server address and authentication credentials from DataImpulse.
Step 1: Define Proxy Credentials
PROXY_END_POINT = 'gw.dataimpulse.com:823' # Example for DataImpulse residential proxy
USERNAME = 'YourProxyPlanUsername'
PASSWORD = 'YourProxyPlanPassword'
Make sure to replace these placeholders with your actual proxy credentials.
Step 2: Send Requests via Proxy
Here's how to include your proxy in the request:
import aiohttp
import asyncio

PROXY_END_POINT = 'gw.dataimpulse.com:823'
USERNAME = 'YourProxyPlanUsername'
PASSWORD = 'YourProxyPlanPassword'

async def get_response_using_proxy():
    proxy_url = f'http://{USERNAME}:{PASSWORD}@{PROXY_END_POINT}'
    async with aiohttp.ClientSession() as session:
        async with session.get('http://ip-api.com/json', proxy=proxy_url) as response:
            print('Status Code:', response.status)
            print('Body:', await response.text())

asyncio.run(get_response_using_proxy())
This includes proxy authentication inside the proxy URL itself.
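If your username or password contains characters that have special meaning in URLs (such as @ or :), percent-encode them before embedding them in the proxy URL. A small helper sketch using the standard library's urllib.parse (the credential values below are placeholders):

```python
from urllib.parse import quote

def build_proxy_url(username: str, password: str, endpoint: str) -> str:
    # Percent-encode credentials so characters like '@' or ':' can't be
    # mistaken for URL delimiters when aiohttp parses the proxy URL.
    return f'http://{quote(username, safe="")}:{quote(password, safe="")}@{endpoint}'

# Placeholder values for illustration:
print(build_proxy_url('user@plan', 'p:ss', 'gw.dataimpulse.com:823'))
# http://user%40plan:p%3Ass@gw.dataimpulse.com:823
```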
Using Basic Authentication for Proxies
Alternatively, AIOHTTP supports BasicAuth to handle proxy authentication separately, which can make your code cleaner:
import aiohttp
import asyncio

PROXY_END_POINT = 'gw.dataimpulse.com:823'
USERNAME = 'YourProxyPlanUsername'
PASSWORD = 'YourProxyPlanPassword'

async def get_response_using_proxy():
    async with aiohttp.ClientSession() as session:
        async with session.get(
            'http://ip-api.com/json',
            proxy=f'http://{PROXY_END_POINT}',
            proxy_auth=aiohttp.BasicAuth(USERNAME, PASSWORD)
        ) as response:
            print('Status Code:', response.status)
            print('Body:', await response.text())

asyncio.run(get_response_using_proxy())
This method separates the proxy URL and authentication credentials for clarity and better security practices.
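AIOHTTP can also pick up proxy settings from the standard HTTP_PROXY/HTTPS_PROXY environment variables when the session is created with trust_env=True, which keeps credentials out of your source code entirely. A minimal sketch (the credentials below are placeholders, so the request is expected to fail until you substitute real ones):

```python
import asyncio
import os

import aiohttp

# Placeholder credentials -- substitute your real DataImpulse values.
os.environ['HTTP_PROXY'] = 'http://YourProxyPlanUsername:YourProxyPlanPassword@gw.dataimpulse.com:823'

async def get_response():
    # trust_env=True tells aiohttp to honour HTTP_PROXY/HTTPS_PROXY.
    async with aiohttp.ClientSession(
        trust_env=True, timeout=aiohttp.ClientTimeout(total=10)
    ) as session:
        try:
            async with session.get('http://ip-api.com/json') as response:
                print('Status Code:', response.status)
        except (aiohttp.ClientError, asyncio.TimeoutError) as exc:
            # Placeholder credentials will normally end up here.
            print('Proxy request failed:', exc)

asyncio.run(get_response())
```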
Rotating Proxies to Avoid Blocks
Websites often block IPs that make too many requests. To handle this, rotating proxies is a common approach. While AIOHTTP doesn’t have built-in proxy rotation, you can implement it easily in Python.
Random Proxy Rotation
Store your proxies in a list, then randomly select one for each request:
import asyncio
import random
import aiohttp

proxy_list = [
    'http://YourProxyPlanUsername:YourProxyPlanPassword@proxy1.example.com:10000',
    'http://YourProxyPlanUsername:YourProxyPlanPassword@proxy2.example.com:10001',
    # Add more proxies as needed
]

async def get_response_using_proxy(proxy):
    async with aiohttp.ClientSession() as session:
        async with session.get('http://ip-api.com/json', proxy=proxy) as response:
            print('Status Code:', response.status)
            print('Body:', await response.text())

proxy = random.choice(proxy_list)
asyncio.run(get_response_using_proxy(proxy))
Each run picks a random proxy from the list to distribute requests and minimize blocks.
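Because AIOHTTP is asynchronous, you can also fire several proxied requests concurrently instead of one at a time, picking a random proxy per request. A sketch using asyncio.gather (the proxy entries are placeholders; return_exceptions=True keeps one bad proxy from aborting the whole batch):

```python
import asyncio
import random

import aiohttp

proxy_list = [
    'http://YourProxyPlanUsername:YourProxyPlanPassword@proxy1.example.com:10000',
    'http://YourProxyPlanUsername:YourProxyPlanPassword@proxy2.example.com:10001',
]

async def fetch_status(session, url, proxy):
    async with session.get(url, proxy=proxy) as response:
        return response.status

async def main():
    timeout = aiohttp.ClientTimeout(total=10)
    async with aiohttp.ClientSession(timeout=timeout) as session:
        tasks = [
            fetch_status(session, 'http://ip-api.com/json', random.choice(proxy_list))
            for _ in range(5)
        ]
        # return_exceptions=True collects failures instead of raising them.
        return await asyncio.gather(*tasks, return_exceptions=True)

results = asyncio.run(main())
print(results)
```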
Round-Robin Proxy Rotation
For predictable proxy usage, iterate through the proxy list in a cycling manner:
import asyncio
import aiohttp

proxy_list = [
    'http://YourProxyPlanUsername:YourProxyPlanPassword@proxy1.example.com:10000',
    'http://YourProxyPlanUsername:YourProxyPlanPassword@proxy2.example.com:10001',
    # Add more proxies as needed
]

async def get_response_using_proxy(target_url, proxy):
    async with aiohttp.ClientSession() as session:
        async with session.get(target_url, proxy=proxy) as response:
            print('Status Code:', response.status)
            print('Body:', await response.text())

number_of_requests = 10
length = len(proxy_list)
for i in range(number_of_requests):
    proxy = proxy_list[i % length]  # cycle through the list in order
    asyncio.run(get_response_using_proxy('http://ip-api.com/json', proxy))
This method cycles through the proxy list in order for each request.
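The modulo arithmetic can also be expressed with the standard library's itertools.cycle, which yields the list's items in order and wraps around automatically. Here is the selection logic on its own (no requests are sent; the proxy URLs are placeholders):

```python
import itertools

proxy_list = [
    'http://YourProxyPlanUsername:YourProxyPlanPassword@proxy1.example.com:10000',
    'http://YourProxyPlanUsername:YourProxyPlanPassword@proxy2.example.com:10001',
]

proxy_cycle = itertools.cycle(proxy_list)

# next() keeps handing out proxies in order: 1, 2, 1, 2, 1, ...
chosen = [next(proxy_cycle) for _ in range(5)]
for proxy in chosen:
    print(proxy)
```

Pass each `next(proxy_cycle)` value as the `proxy` argument of your request function, exactly as in the round-robin example above.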
Reusing and Switching Proxies on Failure
You may want to keep using the same proxy until it fails (e.g., returns a blocked or error status), then switch to the next one:
import asyncio
import aiohttp

proxy_list = [
    'http://YourProxyPlanUsername:YourProxyPlanPassword@proxy1.example.com:10000',
    'http://YourProxyPlanUsername:YourProxyPlanPassword@proxy2.example.com:10001',
    # Add more proxies
]

async def get_response_using_proxy(target_url, proxy):
    async with aiohttp.ClientSession() as session:
        async with session.get(target_url, proxy=proxy) as response:
            print('Status Code:', response.status)
            print('Body:', await response.text())
            return response.status

index = 0
number_of_requests = 10
length = len(proxy_list)
for _ in range(number_of_requests):
    proxy = proxy_list[index]
    status_code = asyncio.run(get_response_using_proxy('http://ip-api.com/json', proxy))
    if status_code != 200:
        index = (index + 1) % length  # move to the next proxy; otherwise keep the current one
This strategy keeps the same proxy until it fails, then rotates to a new one automatically.
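Checking the status code alone misses connection-level failures (DNS errors, refused connections, timeouts), which AIOHTTP raises as exceptions rather than returning a status. Below is a sketch that treats both signals as failure and rotates on either; fetch_with_failover and its attempts parameter are illustrative names, not an AIOHTTP API, and the proxies are placeholders (so every attempt here is expected to fail and the function to return None):

```python
import asyncio

import aiohttp

proxy_list = [
    'http://YourProxyPlanUsername:YourProxyPlanPassword@proxy1.example.com:10000',
    'http://YourProxyPlanUsername:YourProxyPlanPassword@proxy2.example.com:10001',
]

async def fetch_with_failover(url, proxies, attempts=4):
    """Try the current proxy; on a non-200 status or any connection
    error, advance to the next proxy and try again."""
    index = 0
    timeout = aiohttp.ClientTimeout(total=10)
    async with aiohttp.ClientSession(timeout=timeout) as session:
        for _ in range(attempts):
            proxy = proxies[index % len(proxies)]
            try:
                async with session.get(url, proxy=proxy) as response:
                    if response.status == 200:
                        return await response.text()
            except (aiohttp.ClientError, asyncio.TimeoutError):
                pass  # connection-level failure: rotate as well
            index += 1
    return None  # every attempt failed

result = asyncio.run(fetch_with_failover('http://ip-api.com/json', proxy_list))
print(result)
```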
Why Choose DataImpulse for Proxies?
DataImpulse delivers reliable residential proxies that are cost-effective and easy to integrate, making it an excellent fit for AIOHTTP users. With flexible authentication modes and support for IP rotation, your scraping or API requests can stay uninterrupted.
Additionally, DataImpulse has achieved certifications like ISO 27001 to assure data security and compliance, as well as awards recognizing their progress and service quality.
To explore affordable proxy plans and start your own proxy-powered applications, visit DataImpulse.
Summary
- Set up AIOHTTP to make asynchronous HTTP requests.
- Pass proxy details directly in the request or use BasicAuth for authentication.
- Rotate proxies randomly or with a round-robin method to avoid IP blocks.
- Switch proxies on failure to maintain continuous access.
- Use DataImpulse’s proxy services for trusted residential proxies.
Integrating proxies with AIOHTTP lets you build more robust and reliable web scrapers or clients that handle real-world restrictions gracefully.
If you're looking for affordable options to get started, DataImpulse offers proxy plans at just $1 per GB, making it easy to scale your scraping and testing projects without breaking the bank.