What is the fastest way to send a million requests with the same URL, headers, and body?



I have a static URL, headers, and data. Is it possible to send millions of POST requests concurrently with Python? This is my file:

import json
import requests

url = "https://abcd.com"
headers = {"Content-Type": "application/json"}  # static headers (placeholder)
body = '{"key": "value"}'  # static body (placeholder)

resp = requests.post(url, headers=headers, data=body)
json_resp = json.loads(resp.content)["data"]
print(json_resp)

You may want to use a Python load-testing tool for this, for example:

  • https://locust.io/

Your Locust file would look something like this:

from locust import HttpUser, task

headers = {"Content-Type": "application/json"}  # your static headers (placeholder)
body = '{"key": "value"}'  # your static body (placeholder)

class QuickstartUser(HttpUser):
    host = "https://abcd.com"  # base URL of the target

    @task
    def task_name(self):
        self.client.post("/", headers=headers, data=body)

And you can run Locust like this:

locust --headless --users <number_of_users> -f <your_file.py>
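Depending on your Locust version, a headless run may also need a spawn rate (-r / --spawn-rate) and a target host (-H / --host), unless the host is set on the user class as in the sketch above.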

You can do this in a few ways. The best approach, in my opinion, is asyncio; the second option is a ThreadPoolExecutor, which I don't really recommend (a sketch of it follows below).
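For completeness, here is a minimal ThreadPoolExecutor sketch. The URL, headers, body, worker count, and request count are all placeholders to tune, not values from the question:

from concurrent.futures import ThreadPoolExecutor

import requests

url = "https://abcd.com"                        # placeholder, as in the question
headers = {"Content-Type": "application/json"}  # placeholder headers
body = '{"key": "value"}'                       # placeholder body

# One shared Session pools connections; this is common practice, though
# requests does not formally guarantee that Session is thread-safe.
session = requests.Session()

def post_once(_):
    resp = session.post(url, headers=headers, data=body)
    return resp.status_code

# 100 workers and 1000 requests are arbitrary starting points to tune.
with ThreadPoolExecutor(max_workers=100) as executor:
    for status in executor.map(post_once, range(1000)):
        print(status)

The threads spend most of their time blocked on I/O here, but the per-thread overhead is why asyncio tends to scale better for this kind of workload.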

Here is an example of the asyncio approach:

# modified fetch function with semaphore
import asyncio
from aiohttp import ClientSession

async def fetch(url, session):
    async with session.get(url) as response:
        delay = response.headers.get("DELAY")
        date = response.headers.get("DATE")
        print("{}:{} with delay {}".format(date, response.url, delay))
        return await response.read()

async def bound_fetch(sem, url, session):
    # Getter function with semaphore.
    async with sem:
        await fetch(url, session)

async def run(r):
    url = "http://localhost:8080/{}"
    tasks = []
    # Create an instance of Semaphore to cap the number of in-flight requests.
    sem = asyncio.Semaphore(1000)
    # Create a client session so we don't open a new connection per request.
    async with ClientSession() as session:
        for i in range(r):
            # Pass the semaphore and session to every GET request.
            task = asyncio.ensure_future(bound_fetch(sem, url.format(i), session))
            tasks.append(task)
        responses = asyncio.gather(*tasks)
        await responses

number = 10000
loop = asyncio.get_event_loop()
future = asyncio.ensure_future(run(number))
loop.run_until_complete(future)
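The example above issues GETs against a local test server. Since the question asks for POSTs with a static URL, headers, and body, a sketch of the same semaphore pattern adapted to POST might look like this (the URL, headers, and body are placeholders, not values from the question):

import asyncio
from aiohttp import ClientSession

url = "https://abcd.com"                        # placeholder target
headers = {"Content-Type": "application/json"}  # placeholder headers
body = '{"key": "value"}'                       # placeholder body

async def post_one(sem, session):
    # Same semaphore pattern as above, but issuing a POST with static
    # headers and body instead of a GET.
    async with sem:
        async with session.post(url, headers=headers, data=body) as response:
            return await response.read()

async def run(n):
    sem = asyncio.Semaphore(1000)  # cap on in-flight requests, as above
    async with ClientSession() as session:
        tasks = [asyncio.ensure_future(post_one(sem, session)) for _ in range(n)]
        await asyncio.gather(*tasks)

loop = asyncio.get_event_loop()
loop.run_until_complete(run(10000))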
