Processing Large Files with Multiprocessing in Python: How to Load Resources Only Once per Process



Python's multiprocessing.Pool.imap is very convenient for processing a large file line by line:

import multiprocessing

def process(line):
    processor = Processor('some-big.model')  # slow load, repeated for every single line!
    return processor.process(line)

if __name__ == '__main__':
    pool = multiprocessing.Pool(4)
    with open('lines.txt') as infile, open('processed-lines.txt', 'w') as outfile:
        for processed_line in pool.imap(process, infile):
            outfile.write(processed_line)

How can we make sure that a helper (like Processor in the example above) is loaded only once per process? Is this possible without resorting to more complex or verbose constructs involving queues?

multiprocessing.Pool allows resource initialization via its initializer and initargs parameters. I was surprised to learn that the idea is to make use of a global variable, as shown below:

import multiprocessing as mp

def init_process(model):
    global processor
    processor = Processor(model)  # this takes time to load...

def process(line):
    return processor.process(line)  # via global variable `processor` defined in `init_process`

if __name__ == '__main__':
    pool = mp.Pool(4, initializer=init_process, initargs=['some-big.model'])
    with open('lines.txt') as infile, open('processed-lines.txt', 'w') as outfile:
        for processed_line in pool.imap(process, infile):
            outfile.write(processed_line)

This concept is not well described in the multiprocessing.Pool documentation, so I hope this example helps others.
