Celery status stuck in PENDING state



I found the original sample code here for running Celery with MongoDB as the result backend. In that example there is a CELERYBEAT_SCHEDULE that runs a task every minute with some arguments; in my case I just commented that code out, because I only want the task to run as soon as it is received. From the worker log I can't even see that the task was received, and the result status comes back as PENDING. Why is this stuck in PENDING and never completing? It is a simple add task, so I can't imagine it would take long.

Another thing is that I am using a virtual environment, so from what I have been told I should start Celery like this: "celery multi start worker --loglevel=info".

I am new to Celery and this has me a bit confused. Thanks in advance for any help.
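
For reference, the status check can also be turned into a blocking call, which surfaces a timeout instead of a silent PENDING; a minimal sketch, assuming the worker and MongoDB backend from the files below are running:

from tasks import add

result = add.delay(1, 1)
# get() blocks until the task finishes; if no worker ever picks the
# task up, this raises a TimeoutError rather than staying PENDING
print(result.get(timeout=10))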

celeryconfig.py file

# from celery.schedules import crontab
CELERY_RESULT_BACKEND = "mongodb"
CELERY_MONGODB_BACKEND_SETTINGS = {
    "host": "127.0.0.1",
    "port": 27017,
    "database": "jobs", 
    "taskmeta_collection": "stock_taskmeta_collection",
}
# This was part of the original code, but I commented it out in hopes
# that it would run the task right away and not delay.
#
# Used to schedule tasks periodically, passing optional arguments.
# Can be very useful. Celery does not seem to support scheduled tasks, only periodic ones.
# CELERYBEAT_SCHEDULE = {
#     'every-minute': {
#         'task': 'tasks.add',
#         'schedule': crontab(minute='*/1'),
#         'args': (1,2),
#     },
# }
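
For reference, Celery 4.x also accepts the newer lowercase setting names; a sketch of the same configuration in that style (an assumption on my part, the uppercase names above should keep working in 4.1):

# celeryconfig.py, lowercase-style equivalent (Celery 4.x)
result_backend = "mongodb"
mongodb_backend_settings = {
    "host": "127.0.0.1",
    "port": 27017,
    "database": "jobs",
    "taskmeta_collection": "stock_taskmeta_collection",
}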

tasks.py file

from celery import Celery
import time 
# Specify the MongoDB host and database to connect to
BROKER_URL = 'mongodb://localhost:27017/jobs'
celery = Celery('EOD_TASKS',broker=BROKER_URL)
# Load settings for the backend that stores job results
celery.config_from_object('celeryconfig')
@celery.task
def add(x, y):
    time.sleep(5)
    return x + y
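
A quick sanity check that config_from_object('celeryconfig') actually took effect is to print what the app ended up with; a minimal sketch, run from the same directory as tasks.py:

from tasks import celery

print(celery.conf.broker_url)      # expect mongodb://localhost:27017/jobs
print(celery.conf.result_backend)  # expect mongodb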

# starting celery
celery multi start worker --loglevel=info
celery multi v4.1.0 (latentcall)
> Starting nodes...
    > worker@lnx-v2: OK
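
Since celery multi detaches the worker, one way to check that it is alive and has registered the add task is to ask it over the control channel; a rough sketch, assuming the worker was started with the same config:

from tasks import celery

insp = celery.control.inspect(timeout=5)
print(insp.ping())        # e.g. {'worker@lnx-v2': {'ok': 'pong'}}
print(insp.registered())  # 'tasks.add' should appear in the list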

Running the celery task

lnx-v2:171> python
Python 3.4.1 (default, Nov 12 2014, 13:34:48) 
[GCC 4.4.6 20120305 (Red Hat 4.4.6-4)] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> from tasks import add
>>> result = add.delay(1,1)
>>> result
<AsyncResult: 8e6ee263-d8a4-4b17-8d7a-9873b6c98473>
>>> result.status
'PENDING'
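
Another check is whether anything is being written to the taskmeta collection at all; a small sketch with pymongo (assuming a reasonably recent pymongo and the settings from celeryconfig.py above):

from pymongo import MongoClient

client = MongoClient("127.0.0.1", 27017)
coll = client["jobs"]["stock_taskmeta_collection"]
print(coll.count_documents({}))           # 0 means no result was ever stored
for doc in coll.find().limit(5):
    print(doc["_id"], doc.get("status"))  # the _id is the task id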

Worker log

lnx-v2:208> tail -f worker.log
[2017-10-26 13:41:15,658: INFO/MainProcess] mingle: all alone
[2017-10-26 13:41:15,683: INFO/MainProcess] worker@lnx-v2 ready.
[2017-10-26 13:45:50,465: INFO/MainProcess] Connected to amqp://guest:**@127.0.0.1:5672//
[2017-10-26 13:45:50,487: INFO/MainProcess] mingle: searching for neighbors
[2017-10-26 13:45:51,522: INFO/MainProcess] mingle: all alone
[2017-10-26 13:45:51,540: INFO/MainProcess] worker@lnx-v2 ready.
[2017-10-26 13:47:13,169: INFO/MainProcess] Connected to amqp://guest:**@127.0.0.1:5672//
[2017-10-26 13:47:13,191: INFO/MainProcess] mingle: searching for neighbors
[2017-10-26 13:47:14,228: INFO/MainProcess] mingle: all alone
[2017-10-26 13:47:14,254: INFO/MainProcess] worker@lnx-v2 ready.
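
The log above shows the worker connecting to amqp://guest:**@127.0.0.1:5672// rather than the mongodb:// broker set in tasks.py. One way to see which broker a running worker is actually using is to ask it for its stats; a sketch, assuming the worker responds on the control channel and reports a broker section:

from tasks import celery

stats = celery.control.inspect(timeout=5).stats()
for name, info in (stats or {}).items():
    print(name, info.get("broker"))  # hostname/port/transport the worker connected to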

# Celery process
lnx-v2:209> ps -ef | grep celery
15096     1  0 13:47 ?        00:00:00 [celeryd: worker@lnx-v2:MainProcess] -active- (worker --loglevel=info --logfile=worker%I.log --pidfile=worker.pid --hostname=worker@lnx-v2)
15157 15096  0 13:47 ?        00:00:00 [celeryd: worker@lnx-v2:ForkPoolWorker-1]

Check whether the add method is listed among the registered Celery tasks with the following code:

celery.tasks.keys()
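
For example (a sketch, run in an interpreter where tasks.py is importable):

from tasks import celery

print(sorted(celery.tasks.keys()))  # 'tasks.add' should appear alongside celery's built-in tasks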

I think you have to end the decorator with parentheses:

@celery.task()
def add(x, y):
    time.sleep(5)
    return x + y
