I can't figure out why my task is considered done before all of its subtasks have finished.
tasks.scan_user.delay(1)
Code:
from celery import task, chain

@task()
def scan_chunk(ids):
    occs = Occurence.objects.filter(product_id__in=ids)
    result = scan_occurences_dummy_pool((x.id, x.url, x.xpath) for x in occs)
    return result

@task()
def scan_user(id):
    chunks = generate_chunks_from_user(id)
    ch = chain(scan_chunk.si([x.id for x in chunk]) for chunk in chunks)
    return ch()
As you can see from the Celery output below, scan_user succeeds before all of the scan_chunk tasks have finished. This is a problem, because I want to use scan_user inside another chain.
[2017-02-09 14:27:03,493: INFO/MainProcess] Received task: engineapp.tasks.scan_user[ed358a98-a685-4002-baac-993fdc7b64cf]
[2017-02-09 14:27:05,721: INFO/MainProcess] Received task: engineapp.tasks.scan_chunk[35b74e01-f9fa-471f-8c20-ecbf99a89201]
[2017-02-09 14:27:06,740: INFO/MainProcess] Task engineapp.tasks.scan_user[ed358a98-a685-4002-baac-993fdc7b64cf] succeeded in 3.24300003052s: <AsyncResult: 442f9373-d983-4696-a42a-ba42a8ce7761>
[2017-02-09 14:27:22,178: INFO/MainProcess] Received task: engineapp.tasks.scan_chunk[36a94ad4-3c9e-4f7d-a040-5c2a617a0d8f]
[2017-02-09 14:27:23,204: INFO/MainProcess] Task engineapp.tasks.scan_chunk[35b74e01-f9fa-471f-8c20-ecbf99a89201] succeeded in 17.4779999256s: [
I want to create another task that runs scan_user for all users sequentially, but I don't think that's possible, since the chunks actually run in parallel.
ch() just starts the chain without waiting for its result. If you want to wait, do this instead:
ch = chain(scan_chunk.si([x.id for x in chunk]) for chunk in chunks)()
return ch.get()
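The distinction between firing work off and blocking on its result can be sketched with the standard library's concurrent.futures. This is only an analogy, not Celery's API; the scan_chunk stub below is illustrative:

```python
from concurrent.futures import ThreadPoolExecutor
import time

def scan_chunk(ids):
    # Stand-in for the real task: pretend each chunk takes a moment.
    time.sleep(0.01)
    return len(ids)

with ThreadPoolExecutor() as pool:
    # submit() is like ch(): it schedules the work and returns immediately,
    # handing back a future (roughly analogous to Celery's AsyncResult).
    future = pool.submit(scan_chunk, [1, 2, 3])

    # result() is like AsyncResult.get(): it blocks until the work is done.
    print(future.result())  # → 3
```

One caveat: Celery's own documentation discourages calling .get() inside a task (since Celery 4 it raises a RuntimeError by default), so treat the .get() fix above as a quick workaround; restructuring with a chord or a callback is the non-blocking alternative.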