Python Celery - How to call a Celery task from inside another task



I am calling a task from within another task in Django-Celery.

Here are my tasks:

import json

import requests
from celery import shared_task

from .models import Notification, ServerNotificationMapping  # adjust to wherever these models live


@shared_task
def post_notification(data, url):
    url = "http://posttestserver.com/data/?dir=praful"  # when in production, remove this line
    headers = {'content-type': 'application/json'}
    requests.post(url, data=json.dumps(data), headers=headers)


@shared_task
def shipment_server(data, notification_type):
    notification_obj = Notification.objects.get(name=notification_type)
    server_list = ServerNotificationMapping.objects.filter(notification_name=notification_obj)
    for server in server_list:
        task = post_notification.delay(data, server.server_id.url)
        print(task.status)  # prints "'NoneType' has no attribute id"

How do I call a task from within another task? I read somewhere that this can be done with group, but I cannot work out the correct syntax. How should I do it?

I tried this:

for server in server_list:
    task = group(post_notification.s(data, server.server_id.url))().get()
    print(task.status)

which throws a warning:

TxIsolationWarning: Polling results with transaction isolation level repeatable-read within the same transaction may give outdated results. Be sure to commit the transaction for each poll iteration.
  'Polling results with transaction isolation level '

I have no idea what this means!

How can I solve my problem?

This should work:

import celery

celery.current_app.send_task('mymodel.tasks.mytask', args=[arg1, arg2, arg3])
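
Applied to the question's code, this means dispatching post_notification by its registered name from inside shipment_server instead of importing it directly. Below is a minimal sketch, assuming the task is registered under the dotted path 'mymodel.tasks.post_notification' (the real path depends on your project layout); Notification, ServerNotificationMapping and the field names are taken from the question:

from celery import current_app, shared_task

@shared_task
def shipment_server(data, notification_type):
    notification_obj = Notification.objects.get(name=notification_type)
    server_list = ServerNotificationMapping.objects.filter(notification_name=notification_obj)
    for server in server_list:
        # send_task looks the task up by name in the registry, so it works even
        # when importing post_notification directly would create a circular import.
        result = current_app.send_task(
            'mymodel.tasks.post_notification',  # assumed dotted path; adjust to your project
            args=[data, server.server_id.url],
        )
        print(result.id)  # AsyncResult id; the sub-task runs asynchronously on a worker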

You are right, because each task in your for loop gets overwritten by the task variable.

You can try celery.group, like this:

from celery import group, shared_task

@shared_task
def shipment_server(data, notification_type):
    notification_obj = Notification.objects.get(name=notification_type)
    server_list = ServerNotificationMapping.objects.filter(notification_name=notification_obj)

    tasks = [post_notification.s(data, server.server_id.url) for server in server_list]
    results = group(tasks)()
    print(results.get())  # or inspect the GroupResult however you need (ready(), successful(), ...)
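
Note that results.get() blocks shipment_server until every sub-task has finished, and polling the result backend inside the open transaction is what the TxIsolationWarning quoted in the question complains about. If you do not need the results inside the task itself, a minimal sketch of firing the group without waiting on it (again reusing the names from the question) looks like this:

from celery import group, shared_task

@shared_task
def shipment_server(data, notification_type):
    notification_obj = Notification.objects.get(name=notification_type)
    server_list = ServerNotificationMapping.objects.filter(notification_name=notification_obj)

    tasks = [post_notification.s(data, server.server_id.url) for server in server_list]
    # apply_async() dispatches the whole group and returns immediately,
    # so the current task never polls the result backend.
    group_result = group(tasks).apply_async()
    print(group_result.id)                                # id of the group as a whole
    print([child.id for child in group_result.results])   # ids of the individual tasks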

You can call a task from within another task using the delay function:

from app.tasks import celery_add_task

celery_add_task.apply_async(args=[task_name])

It will work.
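
For the question's own tasks the same pattern applies; delay(*args) is simply shorthand for apply_async(args=[...]). A short sketch, with the import path as a placeholder for wherever post_notification really lives:

from app.tasks import post_notification  # placeholder path; use your real tasks module

# inside shipment_server, either call dispatches the sub-task asynchronously
post_notification.delay(data, server.server_id.url)
post_notification.apply_async(args=[data, server.server_id.url])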
