Celery connects to the rabbitmq server instead of the redis server



I have a Django application that I want to configure to run background tasks.

Packages:

  1. Celery==4.2.1

  2. Django==2.1.3

  3. Python==3.5

  4. Redis server==3.0.6

The Celery configuration in my settings.py file is:

from celery.schedules import crontab

CELERY_BROKER_URL = 'redis://localhost:6379'
CELERY_RESULT_BACKEND = 'redis://localhost:6379'
CELERY_ACCEPT_CONTENT = ['application/json']
CELERY_RESULT_SERIALIZER = 'json'
CELERY_TASK_SERIALIZER = 'json'
CELERY_TIMEZONE = 'Asia/Kolkata'
CELERY_BEAT_SCHEDULE = {
    'task-number-one': {
        'task': 'app.tasks.task_number_one',
        'schedule': crontab(minute='*/1'),
    },
}

The celery.py file:

from __future__ import absolute_import, unicode_literals
import os
from celery import Celery
from django.conf import settings
# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'project.settings.prod')
app = Celery('project')
# Using a string here means the worker doesn't have to serialize
# the configuration object to child processes.
# - namespace='CELERY' means all celery-related configuration keys
#   should have a `CELERY_` prefix.
app.config_from_object('django.conf:settings')
# Load task modules from all registered Django app configs.
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)

@app.task(bind=True)
def debug_task(self):
    print('Request: {0!r}'.format(self.request))

When I run: celery -A project worker -l info -B -E

it connects to the rabbitmq server instead of the redis server, as shown below:

-------------- celery@user-desktop v4.2.1 (windowlicker)
---- **** ----- 
--- * ***  * -- Linux-4.15.0-39-generic-x86_64-with-Ubuntu-18.04-bionic 2018-11-21 12:04:51
-- * - **** --- 
- ** ---------- [config]
- ** ---------- .> app:         project:0x7f8b80f78d30
- ** ---------- .> transport:   amqp://guest:**@localhost:5672//
- ** ---------- .> results:     redis://localhost:6379/
- *** --- * --- .> concurrency: 4 (prefork)
-- ******* ---- .> task events: ON
--- ***** ----- 
-------------- [queues]
.> celery           exchange=celery(direct) key=celery

[tasks]
. app.tasks.task_number_one
. project.celery.debug_task
[2018-11-21 12:04:51,741: INFO/Beat] beat: Starting...

The same thing happens in the production environment. In production I have deployed the Django application with Gunicorn and Nginx, and now I want to implement some way of running background tasks, because the django-crontab package does not work.

Questions:

  1. What is wrong with the Celery configuration?

  2. Can anyone recommend a way to run periodic background tasks?

**Note**: I have tried implementing supervisor, but it does not seem to be compatible with Python 3, so I could not configure it.

The setting name for the broker URL changed in v4. It should be BROKER_URL, not CELERY_BROKER_URL.
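A minimal sketch of the corresponding settings.py change, assuming the rest of the configuration from the question stays as it is:

BROKER_URL = 'redis://localhost:6379'  # renamed from CELERY_BROKER_URL
CELERY_RESULT_BACKEND = 'redis://localhost:6379'  # unchanged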

If you copied the contents of celery.py from the official Celery docs, https://docs.celeryproject.org/en/latest/django/first-steps-with-django.html,

try changing the line

app.config_from_object('django.conf:settings', namespace='CELERY')

to

app.config_from_object('django.conf:settings', namespace='')

and replace CELERY_BROKER_URL = 'redis://localhost:6379' with BROKER_URL = 'redis://localhost:6379'. This worked for me.
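The underlying rule is that the prefix on the settings keys has to match the namespace argument passed to config_from_object. A sketch of the other consistent combination, under the assumption that you keep the CELERY_-prefixed keys exactly as they appear in the question:

# celery.py -- keeps the CELERY_-prefixed keys from settings.py working
app.config_from_object('django.conf:settings', namespace='CELERY')

# settings.py -- unchanged from the question
CELERY_BROKER_URL = 'redis://localhost:6379'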

After changing BROKER_URL to CELERY_BROKER_URL, you must also change this line in celery.py

app = Celery('proj')

by adding backend='redis://localhost', broker='redis://' so that it looks like this:

app = Celery('proj', backend='redis://localhost', broker='redis://')

Now it will work :)
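To verify which broker Celery actually picked up, one quick check (assuming the project layout from the question, with the app defined in project/celery.py) is to print the resolved configuration:

# e.g. inside `python manage.py shell`
from project.celery import app

print(app.conf.broker_url)       # should now print redis://localhost:6379
print(app.conf.result_backend)   # should now print redis://localhost:6379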

If you have redis as the broker, queue tasks with the .delay() method, and get a strange connection error 111 (connection refused) to rabbitmq (which you are not using at all), try using .apply_async() instead. This behaviour occurred in production.
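A small sketch of the two call styles, using the task_number_one task from the question; the countdown argument is only an illustration:

from app.tasks import task_number_one

task_number_one.delay()                    # shorthand, no extra options
task_number_one.apply_async()              # equivalent call
task_number_one.apply_async(countdown=10)  # apply_async also accepts options, e.g. run 10 seconds from now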
