Receiving "unregistered task of type" error



I'm getting the error [2021-03-27 18:36:43,996: ERROR/MainProcess] Received unregistered task of type 'orders.tasks.order_created'. The message has been ignored and discarded. This error only occurs with celery multi start w1 -A speedrealm -l DEBUG. My goal is to get the task running in the background. I have tried running from different directories (top-level and the app dir), and I have tried commenting/uncommenting CELERY_IMPORTS. I'm not sure whether it's needed, but there is also a conf.d file. Here are all the associated system files.

performancerealm.com
|---orders
|    |- __init__.py
|    |- tasks.py
|    |- views.py
|
|---speedrealm
|    |- __init__.py
|    |- celery.py
|    |- settings.py 
|
|---manage.py
|--- # other apps

orders/tasks.py

from celery import shared_task 
from django.core.mail import send_mail 
from .models import Order 
@shared_task 
def order_created(order_id):
    pass

speedrealm/__init__.py

from .celery import app as celery_app
__all__=("celery_app",)

speedrealm/celery.py

import os 
from celery import Celery 
from django.conf import settings 
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'speedrealm.settings')
app = Celery('speedrealm')
app.config_from_object('django.conf:settings', namespace="CELERY")
app.autodiscover_tasks()

speedrealm/settings.py

CELERYD_NODES="w1"
CELERY_BIN="/home/bulofants/.local/share/virtualenvs/performancerealm.com-8nBM01mn/bin"
CELERY_APP="speedrealm"
CELERYD_CHDIR="/home/bulofants/performancerealm.com"
CELERYD_OPTS="--time-limit=300 --concurrency=8"
CELERYD_LOG_FILE="/var/log/celery/%n%I.log"
CELERYD_PID_FILE="/var/run/celery/%n.pid"
CELERYD_USER="bulofants"
CELERYD_GROUP="bulofants"
CELERYD_LOG_LEVEL="INFO"
CELERY_CREATE_DIRS=1
CELERY_IMPORTS = [
'orders.tasks'
]
CELERY_TIMEZONE='US/Eastern'


/etc/systemd/system/celery.service

[Unit]
Description=Celery Service 
After=network.target 
[Service]
Type=forking
User=bulofants
EnvironmentFile=/home/bulofants/sites/performancerealm.com 
WorkingDirectory=/home/bulofants/sites/performancerealm.com 
ExecStart=/home/bulofants/.local/share/virtualenvs/performancerealm.com-8nBM01mn/bin/celery multi start ${CELERYD_NODES} -A ${CELERY_APP} --pidfile=${CELERYD_PID_FILE} --logfile=${CELERYD_LOG_FILE} --loglevel=${CELERYD_LOG_LEVEL} ${CELERYD_OPTS}
ExecStop=/home/bulofants/.local/share/virtualenvs/performancerealm.com-8nBM01mn/bin/celery multi stopwait ${CELERYD_NODES} --pidfile=${CELERYD_PID_FILE}
ExecReload=/home/bulofants/.local/share/virtualenvs/performancerealm.com-8nBM01mn/bin/celery multi restart ${CELERYD_NODES} -A ${CELERY_APP} --pidfile=${CELERYD_PID_FILE} --logfile=${CELERYD_LOG_FILE} --loglevel=${CELERYD_LOG_LEVEL} ${CELERYD_OPTS}
[Install]
WantedBy=multi-user.target

orders/views.py

from .tasks import order_created
order_created.delay()

This only works when I run celery -A speedrealm worker -l INFO from the top level, and then it runs great. I say top level because some admins say to run the celery command in the file/app that contains the tasks. If I do that, the orders or speedrealm modules are not found or are reported as misconfigured, and other imports in the module are flagged as undefined, i.e. settings (from django.conf import settings). I'm on a Linux server, Ubuntu 16.04. I start/restart the workers manually and always run the daemon-reload command.
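The error itself is just a registry miss: the worker keeps a mapping from fully qualified task names to functions, and the broker message carries only the name string. A minimal sketch of that lookup (plain Python, not Celery's actual implementation):

```python
# The worker's task registry, sketched as a plain dict keyed by the task's
# fully qualified name -- the only identifier that travels in the message.
registry = {"orders.tasks.order_created": lambda order_id: order_id}

def handle(task_name, *args):
    func = registry.get(task_name)
    if func is None:
        # This is the situation behind "Received unregistered task of type ...":
        # the producer sent a name the worker never imported under that path.
        return f"Received unregistered task of type {task_name!r}"
    return func(*args)

print(handle("orders.tasks.order_created", 42))  # registered -> runs
print(handle("tasks.order_created", 42))         # path mismatch -> error string
```

This is also why the working directory matters: started from inside the app directory, the module imports as tasks rather than orders.tasks, so the names the producer and the worker register no longer match.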

__init__.py

from .celery import app as celery_app
__all__ = ["celery_app"]

celery.py

import os
from celery import Celery
from django.conf import settings
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "yourapp.settings")
app = Celery("yourapp")
app.config_from_object("django.conf:settings", namespace="CELERY")
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)
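autodiscover_tasks works by trying to import a tasks submodule from each installed app, and the lambda defers reading INSTALLED_APPS until Django settings are fully loaded. A simplified stand-in for what the call does (illustrative only, not Celery's code):

```python
import importlib

def autodiscover_tasks(packages, related_name="tasks"):
    # Accept either a list or a zero-argument callable, as Celery does;
    # the callable lets INSTALLED_APPS be evaluated lazily.
    apps = packages() if callable(packages) else packages
    found = []
    for app_name in apps:
        try:
            importlib.import_module(f"{app_name}.{related_name}")
            found.append(f"{app_name}.{related_name}")
        except ImportError:
            pass  # app has no such submodule; nothing to register
    return found

# Stdlib demo: 'email' has a 'mime' submodule, 'json' does not.
print(autodiscover_tasks(lambda: ["email", "json"], related_name="mime"))
```

If an app is missing from INSTALLED_APPS, or its tasks.py fails to import, discovery quietly finds nothing for it and the worker never registers those tasks.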

In settings.py

CELERY_BROKER_URL = "amqp://localhost"  # or Redis instead of RabbitMQ
CELERY_TASK_SERIALIZER = "json"
CELERY_RESULT_SERIALIZER = "json"
CELERY_ACCEPT_CONTENT = ["json"]
CELERY_ENABLE_UTC = True
CELERY_BROKER_TRANSPORT_OPTIONS = {"visibility_timeout": 3600}
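With the serializers pinned to json, task arguments must survive a JSON round-trip, which is why tasks take primitive values such as an order ID rather than model instances (the Order class below is a hypothetical stand-in):

```python
import json

# JSON-safe: a primitive ID round-trips unchanged through the broker.
payload = {"args": [42], "kwargs": {}}
assert json.loads(json.dumps(payload)) == payload

# Not JSON-safe: arbitrary objects (e.g. a Django model instance) raise
# TypeError -- pass order.id into .delay(), not the order object itself.
class Order:
    pass

try:
    json.dumps({"args": [Order()]})
except TypeError as exc:
    print("not serializable:", exc)
```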

In tasks.py

from yourapp.celery import app
@app.task()
def my_task(parameters):
    do_something()

In Linux, /etc/conf.d/celery

# single node
CELERYD_NODES="w1"
# or you could have three nodes:
#CELERYD_NODES="w1 w2 w3"
# Absolute or relative path to the 'celery' command:
CELERY_BIN="/usr/your_celery_path/bin/celery"
CELERY_APP="yourapp"
CELERYD_MULTI="multi"
CELERYD_OPTS="--time-limit=300 --concurrency=2"  # or your own concurrency
CELERYD_PID_FILE="/var/run/celery/%n.pid"
CELERYD_LOG_FILE="/var/log/celery/%n%I.log"
CELERYD_LOG_LEVEL="ERROR"  # or your own logging level

/etc/systemd/system/celery.service

[Unit]
Description=Celery Service
After=network.target
[Service]
Type=forking
User=celery-user
Group=your_webserver_group
EnvironmentFile=/etc/conf.d/celery
WorkingDirectory=/your_django_app_root_path/
ExecStart=/bin/sh -c '${CELERY_BIN} multi start ${CELERYD_NODES} \
  -A ${CELERY_APP} --pidfile=${CELERYD_PID_FILE} --logfile=${CELERYD_LOG_FILE} \
  --loglevel=${CELERYD_LOG_LEVEL} ${CELERYD_OPTS}'
ExecStop=/bin/sh -c '${CELERY_BIN} multi stopwait ${CELERYD_NODES} \
  --pidfile=${CELERYD_PID_FILE}'
ExecReload=/bin/sh -c '${CELERY_BIN} multi restart ${CELERYD_NODES} \
  -A ${CELERY_APP} --pidfile=${CELERYD_PID_FILE} \
  --logfile=${CELERYD_LOG_FILE} --loglevel=${CELERYD_LOG_LEVEL} ${CELERYD_OPTS}'
Restart=always
[Install]
WantedBy=multi-user.target

Create a celery user (or another username of your choice) and add that user to the web server group. Set the group permissions so it can access the Django app folder.

Create the PID and log folders and give the Celery user permission to them.

sudo useradd -r celery-user -s /sbin/nologin
sudo usermod -a -G your_webserver_group celery-user
sudo mkdir /var/log/celery
sudo chown -R celery-user:your_webserver_group /var/log/celery
sudo mkdir /var/run/celery
sudo chown -R celery-user:your_webserver_group /var/run/celery
sudo systemctl daemon-reload
sudo systemctl enable celery.service
sudo systemctl start celery.service
