Error setting up Django + Celery + Supervisor + Redis



I am setting up the following stack on a CentOS server. I have a supervisor task that starts and serves the website, but I cannot get supervisor working for celery. The worker appears to recognize the tasks, but when I try to execute one, it never picks it up. My redis is up and running on port 6380.

Django==1.10.3
amqp==1.4.9
billiard==3.3.0.23
celery==3.1.25
kombu==3.0.37
pytz==2016.10

My celery.ini:

[program:celeryd]
command=/root/myproject/myprojectenv/bin/celery worker -A mb --loglevel=INFO

environment=PATH="/root/myproject/myprojectenv/bin/",VIRTUAL_ENV="/root/myproject/myprojectenv",PYTHONPATH="/root/myproject/myprojectenv/lib/python2.7:/root/myproject/myprojectenv/lib/python2.7/site-packages"
directory=/home/.../myapp/
user=nobody
numprocs=1
stdout_logfile=/home/.../myapp/log_celery/worker.log
stderr_logfile=/home/.../myapp/log_celery/worker.log
autostart=true
autorestart=true
startsecs=10
; Need to wait for currently executing tasks to finish at shutdown.
; Increase this if you have very long running tasks.
stopwaitsecs = 1200
; When resorting to send SIGKILL to the program to terminate it
; send SIGKILL to its whole process group instead,
; taking care of its children as well.
killasgroup=true
; Set Celery priority higher than default (999)
; so, if rabbitmq(redis) is supervised, it will start first.
priority=1000

The process starts, and when I go to the project folder and run:

>python manage.py celery status
celery@ssd-1v: OK
1 node online.

When I open celery's log file, I can see that the tasks are loaded:

[tasks]
. mb.tasks.add
. mb.tasks.update_search_index
. orders.tasks.order_created

My mb/tasks.py:

from mb.celeryapp import app
import django
django.setup()
@app.task
def add(x, y):
    print(x + y)
    return x + y

My mb/celery.py:

from __future__ import absolute_import, unicode_literals
import os
from celery import Celery
from django.conf import settings
# set the default Django settings module for the 'celery' program.
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "mb.settings")
app = Celery('mb', broker='redis://localhost:6380/', backend='redis://localhost:6380/')
app.conf.broker_url = 'redis://localhost:6380/0'
app.conf.result_backend = 'redis://localhost:6380/'
app.conf.timezone = 'Europe/Sofia'
app.config_from_object('django.conf:settings')
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)

My mb/settings.py:

...
WSGI_APPLICATION = 'mb.wsgi.application'
BROKER_URL = 'redis://localhost:6380/0'
CELERYBEAT_SCHEDULER = 'djcelery.schedulers.DatabaseScheduler'
STATICFILES_STORAGE = 'whitenoise.storage.CompressedManifestStaticFilesStorage'
...
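One detail worth checking: `app.config_from_object('django.conf:settings')` runs last in mb/celery.py, so Django's settings override the URLs passed to `Celery(...)` and set via `app.conf`. The worker banner later shows `results: disabled://`, which suggests the result backend was dropped in that override. A hedged fix (config fragment, Celery 3.x setting names) is to declare both values in settings.py as well:

```python
# settings.py (sketch): declare the broker AND result backend here, since
# config_from_object('django.conf:settings') overrides what was passed to
# Celery(...) in mb/celery.py.
BROKER_URL = 'redis://localhost:6380/0'
CELERY_RESULT_BACKEND = 'redis://localhost:6380/0'
```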

When I run:

python manage.py shell
>>> from mb.tasks import add
>>> add.name
'mb.tasks.add'
>>> result=add.delay(1,1)
>>> result.ready()
False
>>> result.status
'PENDING'

As mentioned above, after that I see no further changes in the log. If I try to run the worker from the command line:

/root/myproject/myprojectenv/bin/celery worker -A mb --loglevel=INFO
Running a worker with superuser privileges when the
worker accepts messages serialized with pickle is a very bad idea!
If you really want to continue then you have to set the C_FORCE_ROOT
environment variable (but please think about this before you do).
User information: uid=0 euid=0 gid=0 egid=0

But I suppose that is expected, since here I am running it as root instead of as nobody. Interestingly, the bare command `celery status` (as opposed to `python manage.py celery status`) fails with a connection error, perhaps because it is looking for redis on a different port, even though supervisor's process starts fine... and when I run `celery worker -A mb` it reports OK. Any ideas?

(myprojectenv) [root@ssd-1v]# celery status
Traceback (most recent call last):
  File "/root/myproject/myprojectenv/bin/celery", line 11, in <module>
    sys.exit(main())
  File "/root/myproject/myprojectenv/lib/python2.7/site-packages/celery/__main__.py", line 30, in main
    main()
  File "/root/myproject/myprojectenv/lib/python2.7/site-packages/celery/bin/celery.py", line 81, in main
    cmd.execute_from_commandline(argv)
  File "/root/myproject/myprojectenv/lib/python2.7/site-packages/celery/bin/celery.py", line 793, in execute_from_commandline
    super(CeleryCommand, self).execute_from_commandline(argv)))
  File "/root/myproject/myprojectenv/lib/python2.7/site-packages/celery/bin/base.py", line 311, in execute_from_commandline
    return self.handle_argv(self.prog_name, argv[1:])
  File "/root/myproject/myprojectenv/lib/python2.7/site-packages/celery/bin/celery.py", line 785, in handle_argv
    return self.execute(command, argv)
  File "/root/myproject/myprojectenv/lib/python2.7/site-packages/celery/bin/celery.py", line 717, in execute
    ).run_from_argv(self.prog_name, argv[1:], command=argv[0])
  File "/root/myproject/myprojectenv/lib/python2.7/site-packages/celery/bin/base.py", line 315, in run_from_argv
    sys.argv if argv is None else argv, command)
  File "/root/myproject/myprojectenv/lib/python2.7/site-packages/celery/bin/base.py", line 377, in handle_argv
    return self(*args, **options)
  File "/root/myproject/myprojectenv/lib/python2.7/site-packages/celery/bin/base.py", line 274, in __call__
    ret = self.run(*args, **kwargs)
  File "/root/myproject/myprojectenv/lib/python2.7/site-packages/celery/bin/celery.py", line 473, in run
    replies = I.run('ping', **kwargs)
  File "/root/myproject/myprojectenv/lib/python2.7/site-packages/celery/bin/celery.py", line 325, in run
    return self.do_call_method(args, **kwargs)
  File "/root/myproject/myprojectenv/lib/python2.7/site-packages/celery/bin/celery.py", line 347, in do_call_method
    return getattr(i, method)(*args)
  File "/root/myproject/myprojectenv/lib/python2.7/site-packages/celery/app/control.py", line 100, in ping
    return self._request('ping')
  File "/root/myproject/myprojectenv/lib/python2.7/site-packages/celery/app/control.py", line 71, in _request
    timeout=self.timeout, reply=True,
  File "/root/myproject/myprojectenv/lib/python2.7/site-packages/celery/app/control.py", line 316, in broadcast
    limit, callback, channel=channel,
  File "/root/myproject/myprojectenv/lib/python2.7/site-packages/kombu/pidbox.py", line 283, in _broadcast
    chan = channel or self.connection.default_channel
  File "/root/myproject/myprojectenv/lib/python2.7/site-packages/kombu/connection.py", line 771, in default_channel
    self.connection
  File "/root/myproject/myprojectenv/lib/python2.7/site-packages/kombu/connection.py", line 756, in connection
    self._connection = self._establish_connection()
  File "/root/myproject/myprojectenv/lib/python2.7/site-packages/kombu/connection.py", line 711, in _establish_connection
    conn = self.transport.establish_connection()
  File "/root/myproject/myprojectenv/lib/python2.7/site-packages/kombu/transport/pyamqp.py", line 116, in establish_connection
    conn = self.Connection(**opts)
  File "/root/myproject/myprojectenv/lib/python2.7/site-packages/amqp/connection.py", line 165, in __init__
    self.transport = self.Transport(host, connect_timeout, ssl)
  File "/root/myproject/myprojectenv/lib/python2.7/site-packages/amqp/connection.py", line 186, in Transport
    return create_transport(host, connect_timeout, ssl)
  File "/root/myproject/myprojectenv/lib/python2.7/site-packages/amqp/transport.py", line 299, in create_transport
    return TCPTransport(host, connect_timeout)
  File "/root/myproject/myprojectenv/lib/python2.7/site-packages/amqp/transport.py", line 95, in __init__
    raise socket.error(last_err)
socket.error: [Errno 111] Connection refused
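Note that the failing frames go through `kombu/transport/pyamqp.py`, i.e. the AMQP transport, not Redis: a bare `celery status` (without `-A mb`) never loads the project's app, so it falls back to Celery's default broker URL (assumption: stock Celery 3.1 defaults shown below), which targets AMQP's port 5672 and gets the connection refused:

```python
# The default broker when no app is loaded (assumed Celery 3.1 default);
# parsing it shows why the refused connection is on 5672, not redis's 6380.
try:
    from urllib.parse import urlparse   # Python 3
except ImportError:
    from urlparse import urlparse       # Python 2.7, as on the server

default_broker = 'amqp://guest:guest@localhost:5672//'
parts = urlparse(default_broker)
print(parts.scheme, parts.hostname, parts.port)  # amqp localhost 5672
```

Running `celery -A mb status` (or the `manage.py` variant) makes the command load the app and therefore the Redis broker URL.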

Any help would be greatly appreciated.

UPDATE:

When I run

$:python manage.py shell
>>from mb.tasks import add
>>add
<@task: mb.tasks.add of mb:0x2b3f6d0>

The address 0x2b3f6d0 is not the same as the one celery claims in its log, i.e. the shell and the worker hold two different app instances:

[config]
- ** ---------- .> app:         mb:0x3495bd0
- ** ---------- .> transport:   redis://localhost:6380/0
- ** ---------- .> results:     disabled://
- *** --- * --- .> concurrency: 1 (prefork)

OK, the answer in this case was that the gunicorn file was actually starting the project from the system-wide python packages and not from the virtual environment.
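A quick way to catch this kind of mismatch is to print the interpreter details from each process. Running this sketch in the shell started inside the virtualenv versus the one gunicorn uses should show different paths if gunicorn bypasses the env:

```python
# Sketch: identify which python installation the current process is using.
import os
import sys

print(sys.executable)                 # path of the running python binary
print(os.path.dirname(os.__file__))   # stdlib location -> which install
print(sys.prefix)                     # virtualenv prefix when one is active
```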
