I am trying to run a multiprocessing function in Spyder 3. After it wouldn't print anything inside the loop and got stuck indefinitely, I read that I could run it in an external terminal from Spyder like this:
Run > Configuration per file > Execute in an external system terminal
Now it finally shows me something. Sadly, it is this:
Traceback (most recent call last):
  File "<string>", line 1, in <module>
  File "c:\users\ben\appdata\local\programs\python\python37\lib\multiprocessing\spawn.py", line 105, in spawn_main
    exitcode = _main(fd)
  File "c:\users\ben\appdata\local\programs\python\python37\lib\multiprocessing\spawn.py", line 114, in _main
    prepare(preparation_data)
  File "c:\users\ben\appdata\local\programs\python\python37\lib\multiprocessing\spawn.py", line 225, in prepare
    _fixup_main_from_path(data['init_main_from_path'])
  File "c:\users\ben\appdata\local\programs\python\python37\lib\multiprocessing\spawn.py", line 277, in _fixup_main_from_path
    run_name="__mp_main__")
  File "c:\users\ben\appdata\local\programs\python\python37\lib\runpy.py", line 263, in run_path
    pkg_name=pkg_name, script_name=fname)
  File "c:\users\ben\appdata\local\programs\python\python37\lib\runpy.py", line 96, in _run_module_code
    mod_name, mod_spec, pkg_name, script_name)
  File "c:\users\ben\appdata\local\programs\python\python37\lib\runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "C:\Users\ben\Arc_Project\löschbarer_test_der_mp_func.py", line 161, in <module>
    cpus,closeIdx=valueSeries.index,t1=df['t1'])
  File "C:\Users\ben\Arc_Project\löschbarer_test_der_mp_func.py", line 27, in mpPandasObj
    out=processJobs(jobs,numThreads=numThreads)
  File "C:\Users\ben\Arc_Project\löschbarer_test_der_mp_func.py", line 75, in processJobs
    pool=mp.Pool(processes=numThreads)
  File "c:\users\ben\appdata\local\programs\python\python37\lib\multiprocessing\context.py", line 119, in Pool
    context=self.get_context())
  File "c:\users\ben\appdata\local\programs\python\python37\lib\multiprocessing\pool.py", line 176, in __init__
    self._repopulate_pool()
  File "c:\users\ben\appdata\local\programs\python\python37\lib\multiprocessing\pool.py", line 241, in _repopulate_pool
    w.start()
  File "c:\users\ben\appdata\local\programs\python\python37\lib\multiprocessing\process.py", line 112, in start
    self._popen = self._Popen(self)
  File "c:\users\ben\appdata\local\programs\python\python37\lib\multiprocessing\context.py", line 322, in _Popen
    return Popen(process_obj)
  File "c:\users\ben\appdata\local\programs\python\python37\lib\multiprocessing\popen_spawn_win32.py", line 46, in __init__
    prep_data = spawn.get_preparation_data(process_obj._name)
  File "c:\users\ben\appdata\local\programs\python\python37\lib\multiprocessing\spawn.py", line 143, in get_preparation_data
    _check_not_importing_main()
  File "c:\users\ben\appdata\local\programs\python\python37\lib\multiprocessing\spawn.py", line 136, in _check_not_importing_main
    is not going to be frozen to produce an executable.''')
And this occurs in a (probably infinite) loop:
RuntimeError:
An attempt has been made to start a new process before the
current process has finished its bootstrapping phase.
This probably means that you are not using fork to start your
child processes and you have forgotten to use the proper idiom
in the main module:
if __name__ == '__main__':
    freeze_support()
    ...
The "freeze_support()" line can be omitted if the program
is not going to be frozen to produce an executable.
Here is the multiprocessing function:
import multiprocessing as mp
import time

def processJobs(jobs, task=None, numThreads=24):
    # Run in parallel.
    # jobs must contain a 'func' callback, for expandCall
    if task is None:
        task = jobs[0]['func'].__name__
    pool = mp.Pool(processes=numThreads)
    outputs, out, time0 = pool.imap_unordered(expandCall, jobs), [], time.time()
    # Process asynchronous output, report progress
    # (expandCall and reportProgress are defined elsewhere in the script)
    for i, out_ in enumerate(outputs, 1):  # <---- Here.
        out.append(out_)
        reportProgress(i, len(jobs), time0, task)
    pool.close(); pool.join()  # this is needed to prevent memory leaks
    return out
How can I fix this? (I really do need multiprocessing.)
Edit: I tried it in PyCharm -> it reports the same error.
I just ran into this. You are probably calling processJobs() directly at the top level of your Python script.
You just need to add the main guard idiom, like this:
if __name__ == '__main__':
    processJobs()
I suspect that multiprocessing forks/spawns several Python processes in the OS and each of them imports your module; without this guard, that import triggers the Pool instantiation again, recursively.
Having the main guard ensures that the Pool instantiation runs only once.
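For illustration, here is a minimal, self-contained sketch of that layout on Windows (spawn start method). The square worker and the simplified processJobs below are placeholders, not your actual expandCall/mpPandasObj code:

import multiprocessing as mp

def square(x):
    # placeholder worker; stands in for the real expandCall
    return x * x

def processJobs(jobs, numThreads=4):
    # simplified stand-in for your function: map jobs over a pool
    pool = mp.Pool(processes=numThreads)
    out = list(pool.imap_unordered(square, jobs))
    pool.close(); pool.join()
    return out

if __name__ == '__main__':
    # Everything that starts processes is reachable only from here.
    # When a spawned child re-imports this module as __mp_main__,
    # this block is skipped, so no recursive Pool creation happens.
    print(processJobs(list(range(10))))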