I can successfully post each record from a CSV file one at a time. However, I am trying to implement multiprocessing so that large data files can be handled more efficiently in the future.
import csv
from itertools import groupby

import requests

ENDPOINT_URL = 'https://example.com'
headers = {'Api-key': '123abc'}

with open("student.csv", "r") as csv_ledger:
    r = csv.DictReader(csv_ledger)
    data = [dict(d) for d in r]

groups = {}
for k, g in groupby(data, lambda r: r['name']):
    # My data mapping
    # for loop to post each record
    post_api = requests.post(ENDPOINT_URL, json=groups, headers=headers)
Is there a simple, modern way to multiprocess these API requests?
Update: I tried using grequests, but the data I post comes back empty:
rs = (grequests.post(u,json=groups, headers=headers) for u in ENDPOINT_URL)
grequests.map(rs)
print(grequests.map(rs))
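For reference, a likely cause of the empty output (my reading, not stated in the post): ENDPOINT_URL is a single string, so for u in ENDPOINT_URL iterates over its characters rather than over URLs, and grequests.map consumes the generator on the first call, so the second call prints an empty list. A minimal sketch of the usual grequests pattern, assuming a single endpoint and a hypothetical payloads list standing in for the mapped groups:

import grequests

ENDPOINT_URL = 'https://example.com'
headers = {'Api-key': '123abc'}

# Hypothetical sample payloads; in practice these would be the mapped groups dicts.
payloads = [{"name": "example"}]

# Build one request per payload against the single endpoint,
# instead of iterating over the characters of the URL string.
reqs = (grequests.post(ENDPOINT_URL, json=p, headers=headers) for p in payloads)

# map() sends the requests and consumes the generator, so call it once
# and keep the returned list of responses.
responses = grequests.map(reqs)
for resp in responses:
    print(resp)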
You can use the threading library.
import csv
import threading
from itertools import groupby

import requests

ENDPOINT_URL = 'https://example.com'
headers = {'Api-key': '123abc'}

with open("student.csv", "r") as csv_ledger:
    r = csv.DictReader(csv_ledger)
    data = [dict(d) for d in r]

groups = {}
for k, g in groupby(data, lambda r: r['name']):
    # My data mapping
    # Keyword arguments go in kwargs; putting headers=headers inside the
    # args tuple is a syntax error.
    t = threading.Thread(target=requests.post,
                         args=(ENDPOINT_URL,),
                         kwargs={'json': groups, 'headers': headers})
    t.daemon = True
    t.start()
Note: daemon = True (the older setDaemon(True)) means the threads will not keep the program alive; they are killed automatically when the main program exits, so requests still in flight at that point may be cut off.
Update: if you want to pass keyword arguments, check here: keyword arguments thread
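As a further option (not from the original answer), the standard-library concurrent.futures module provides a thread pool that takes keyword arguments naturally and also hands back the responses so status codes can be checked. A minimal sketch under the same assumptions, collecting the per-name payloads into a list first:

import csv
from concurrent.futures import ThreadPoolExecutor
from itertools import groupby

import requests

ENDPOINT_URL = 'https://example.com'
headers = {'Api-key': '123abc'}

def post_group(payload):
    # Keyword arguments are passed directly to requests.post here,
    # so no args/kwargs juggling is needed.
    return requests.post(ENDPOINT_URL, json=payload, headers=headers)

with open("student.csv", "r") as csv_ledger:
    data = [dict(d) for d in csv.DictReader(csv_ledger)]

payloads = []
for k, g in groupby(data, lambda r: r['name']):
    groups = {}  # your data mapping goes here
    payloads.append(groups)

# The pool posts the payloads concurrently and returns the responses in order.
with ThreadPoolExecutor(max_workers=8) as pool:
    responses = list(pool.map(post_group, payloads))

for resp in responses:
    print(resp.status_code)

Unlike daemon threads, the with block waits for every request to finish before the program continues, so nothing is cut off when the script exits.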