How do I apply keyword-only arguments to a function in a multiprocessing pool?



> I have a function that takes a keyword-only argument and I want to run it in a process pool. How do I pass my entries from an iterable to the function in the worker processes as keyword arguments?

import multiprocessing
greetees = ('Foo', 'Bar')
def greet(*, greetee):
    return f'Hello, {greetee}!'

I tried using multiprocessing.Pool.map:

greetings = multiprocessing.Pool(2).map(greet, greetees)
for greeting in greetings:
    print(greeting)

But this raises an exception, as expected:

multiprocessing.pool.RemoteTraceback: 
"""
Traceback (most recent call last):
  File "/usr/lib/python3.6/multiprocessing/pool.py", line 119, in worker
    result = (True, func(*args, **kwds))
  File "/usr/lib/python3.6/multiprocessing/pool.py", line 44, in mapstar
    return list(map(*args))
TypeError: greet() takes 0 positional arguments but 1 was given
"""

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/home/bengt/Projekte/gitlab.com/PFASDR/PFASDR.Code.Main/pfasdr/neural/multi_pool_kwargs.py", line 10, in <module>
    greetings = multiprocessing.Pool(2).map(greet, greetees)
  File "/usr/lib/python3.6/multiprocessing/pool.py", line 266, in map
    return self._map_async(func, iterable, mapstar, chunksize).get()
  File "/usr/lib/python3.6/multiprocessing/pool.py", line 644, in get
    raise self._value
TypeError: greet() takes 0 positional arguments but 1 was given
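The failure is expected: inside the worker, mapstar calls the function with each item as a single positional argument, so the call is effectively greet('Foo'). A quick demonstration of the difference outside the pool:

greet('Foo')          # TypeError: greet() takes 0 positional arguments but 1 was given
greet(greetee='Foo')  # works: 'Hello, Foo!'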

If I remove the asterisk so the parameter is no longer keyword-only, it works fine:

[...]
def greet(greetee):
    return f'Hello, {greetee}!'
[...]

Output:

Hello, Foo!
Hello, Bar!

One solution here is to use Pool.apply or Pool.apply_async, both of which accept a kwds dictionary:

greetings = list(
    multiprocessing.Pool(2).apply(greet, kwds={'greetee': greetees[i]})
    for i in range(len(greetees))
)
for greeting in greetings:
    print(greeting)

Output:

Hello, Foo!
Hello, Bar!
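Note that Pool.apply blocks until each call finishes, so the expression above runs the calls one at a time. A minimal sketch using Pool.apply_async instead (same greet and greetees as above; the __main__ guard is only there so the example also runs on platforms that spawn workers):

import multiprocessing

greetees = ('Foo', 'Bar')

def greet(*, greetee):
    return f'Hello, {greetee}!'

if __name__ == '__main__':
    with multiprocessing.Pool(2) as pool:
        # Submit every call first; apply_async returns AsyncResult objects
        # instead of blocking on each result the way apply does.
        results = [
            pool.apply_async(greet, kwds={'greetee': greetee})
            for greetee in greetees
        ]
        for result in results:
            print(result.get())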

Courtesy of Mad Physicist and this QnA, functools.partial can be used to inject the keyword-only argument into the function:

from functools import partial

greetings = []
for i in range(len(greetees)):
    kwargs = {'greetee': greetees[i]}
    greet_partial = partial(greet, **kwargs)
    greetings.append(multiprocessing.Pool(2).apply(greet_partial))

Or, more compactly:

from functools import partial

greetings = [
    multiprocessing.Pool(2).apply(
        partial(greet, **{'greetee': greetees[i]})
    )
    for i in range(len(greetees))
]
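If you would rather keep a map-style call (one pool, parallel workers), partial alone does not help, because each item needs a different value for greetee. A sketch under that assumption, using a hypothetical module-level wrapper named greet_positional that forwards its positional argument as the keyword-only one:

import multiprocessing

greetees = ('Foo', 'Bar')

def greet(*, greetee):
    return f'Hello, {greetee}!'

def greet_positional(greetee):
    # Thin, picklable wrapper: takes a positional value and forwards it
    # as the keyword-only argument that greet requires.
    return greet(greetee=greetee)

if __name__ == '__main__':
    with multiprocessing.Pool(2) as pool:
        greetings = pool.map(greet_positional, greetees)
    for greeting in greetings:
        print(greeting)

This keeps the pool creation to a single place instead of constructing a new Pool(2) for every item, as the apply-based loops above do.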
