Python multiprocessing is losing values



I am trying to use Pool from the multiprocessing package to speed up a computation. I do get a significant speedup, but as the number of cores/workers increases, more and more of my counted values go missing.

I share my counters with all processes through the mp.Value class.

Where am I going wrong, and how can I fix it?

import itertools
import multiprocessing as mp
from multiprocessing import Pool

poss = [x + 1 for x in range(20)]
all_rolls = itertools.product(poss, repeat=6)

# shared counters
win = mp.Value('i', 0)
draw = mp.Value('i', 0)
loose = mp.Value('i', 0)

def some_func(roll):
    if (comparison on rolls):
        win.value += 1
    elif (other comparison):
        draw.value += 1
    else:
        loose.value += 1

with Pool(8) as p:
    p.map(some_func, all_rolls)

On 16 cores I end up with 55,923,638 counted values instead of the expected 64,000,000.

You need to protect the modification of the Values with a Lock (see this post):

from multiprocessing import Lock

lock = Lock()

def some_func(roll):
    with lock:  # only one process at a time may update the counters
        if (comparison on rolls):
            win.value += 1
        elif (other comparison):
            draw.value += 1
        else:
            loose.value += 1
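
For completeness, here is a minimal runnable sketch of this fix. Because a Pool may spawn its workers rather than fork them, the lock and the shared Values are handed to each worker through the Pool initializer; the helper name init_worker, the chunksize, and the roll comparison (counting sixes) are placeholders of mine, not the original logic.

import itertools
import multiprocessing as mp

def init_worker(shared_lock, shared_win, shared_draw, shared_loose):
    # Expose the shared objects as globals inside each worker process.
    global lock, win, draw, loose
    lock, win, draw, loose = shared_lock, shared_win, shared_draw, shared_loose

def some_func(roll):
    sixes = roll.count(6)   # placeholder for the real comparison on rolls
    with lock:              # serialise the read-modify-write on the counters
        if sixes >= 2:
            win.value += 1
        elif sixes == 1:
            draw.value += 1
        else:
            loose.value += 1

if __name__ == '__main__':
    poss = [x + 1 for x in range(20)]
    all_rolls = itertools.product(poss, repeat=6)
    lock = mp.Lock()
    win = mp.Value('i', 0)
    draw = mp.Value('i', 0)
    loose = mp.Value('i', 0)
    with mp.Pool(8, initializer=init_worker,
                 initargs=(lock, win, draw, loose)) as p:
        p.map(some_func, all_rolls, chunksize=100_000)
    print(win.value, draw.value, loose.value)  # the three counts sum to 64,000,000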

In addition to @jfowkes' answer, note that you can give each Value its own lock, which may make things faster:

win = mp.Value('i', 0, lock=True)
draw = mp.Value('i', 0, lock=True)
loose = mp.Value('i', 0, lock=True)

def some_func(roll):
    if (comparison on rolls):
        with win.get_lock():
            win.value += 1
    elif (other comparison):
        with draw.get_lock():
            draw.value += 1
    else:
        with loose.get_lock():
            loose.value += 1
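
A note on this design choice: the per-Value locks only pay off when concurrent workers tend to take different branches. If most rolls land in the same branch, their increments still serialise on that one counter's lock, so in that case this version and the single-lock version should perform similarly.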
