Celery task not running, stuck in PENDING



I'm following one of the many tutorials on the internet, setting up a Flask/RabbitMQ/Celery app with Docker/Docker Compose. All the containers appear to start successfully, but when I hit the endpoint the app just hangs. The task seems to be stuck in PENDING and never actually completes. There are no errors in the Docker output, so I'm really confused about why this isn't working. The only output I see when I hit the endpoint is:

rabbit_1    | 2021-05-13 01:38:07.942 [info] <0.760.0> accepting AMQP connection <0.760.0> (172.19.0.4:45414 -> 172.19.0.2:5672)
rabbit_1    | 2021-05-13 01:38:07.943 [info] <0.760.0> connection <0.760.0> (172.19.0.4:45414 -> 172.19.0.2:5672): user 'rabbitmq' authenticated and granted access to vhost '/'
rabbit_1    | 2021-05-13 01:38:07.952 [info] <0.776.0> accepting AMQP connection <0.776.0> (172.19.0.4:45416 -> 172.19.0.2:5672)
rabbit_1    | 2021-05-13 01:38:07.953 [info] <0.776.0> connection <0.776.0> (172.19.0.4:45416 -> 172.19.0.2:5672): user 'rabbitmq' authenticated and granted access to vhost '/'

I'm really not sure what I'm doing wrong here, as the docs haven't been much help.
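In case it's useful, these are the standard CLI commands I know of for seeing where tasks are actually going (run inside the worker and rabbit containers respectively; workerA is my module name):

```shell
# Ask the running workers which queues they are consuming from
celery -A workerA inspect active_queues

# List RabbitMQ queues and how many messages sit unconsumed in each
rabbitmqctl list_queues name messages
```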

Dockerfile

FROM python:3
COPY ./requirements.txt /app/requirements.txt
WORKDIR /app
RUN pip install -r requirements.txt
COPY . /app
ENTRYPOINT [ "python" ]
CMD ["app.py","--host=0.0.0.0"]

Flask app.py

from workerA import add_nums
from flask import (
    Flask,
    request,
    jsonify,
)

app = Flask(__name__)

@app.route("/add")
def add():
    first_num = request.args.get('f')
    second_num = request.args.get('s')
    result = add_nums.delay(first_num, second_num)
    return jsonify({'result': result.get()}), 200

if __name__ == '__main__':
    app.run(debug=True, host='0.0.0.0')

Celery workerA.py

from celery import Celery
# Celery configuration
CELERY_BROKER_URL = 'amqp://rabbitmq:rabbitmq@rabbit:5672/'
CELERY_RESULT_BACKEND = 'rpc://'
# Initialize Celery
celery = Celery('workerA', broker=CELERY_BROKER_URL, backend=CELERY_RESULT_BACKEND)

@celery.task()
def add_nums(a, b):
    return a + b

docker-compose.yml

version: "3"
services:
  web:
    build:
      context: .
      dockerfile: Dockerfile
    restart: always
    ports:
      - "5000:5000"
    depends_on:
      - rabbit
    volumes:
      - .:/app
  rabbit:
    hostname: rabbit
    image: rabbitmq:management
    environment:
      - RABBITMQ_DEFAULT_USER=rabbitmq
      - RABBITMQ_DEFAULT_PASS=rabbitmq
    ports:
      - "5673:5672"
      - "15672:15672"
  worker_1:
    build:
      context: .
    hostname: worker_1
    entrypoint: celery
    command: -A workerA worker --loglevel=info -Q workerA
    volumes:
      - .:/app
    links:
      - rabbit
    depends_on:
      - rabbit

Okay, after a lot of digging I determined that the problem was the queue name for the task. The worker is started with -Q workerA, so it only consumes from the workerA queue, but Celery was publishing the task to its default queue (celery), which nothing was consuming. I adjusted my decorator like so:

@celery.task(queue='workerA')
def add_nums(a, b):
    return a + b

Now it works!
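An equivalent fix, if you'd rather keep routing out of the task definitions, is Celery's task_routes setting — a sketch against the workerA.py above (same celery app object, same module name):

```python
# Route by fully qualified task name instead of tagging each task with queue=...
celery.conf.task_routes = {
    'workerA.add_nums': {'queue': 'workerA'},
}
```

Either way, the point is the same: the queue a task is published to has to match one of the queues the worker consumes (here, the -Q workerA flag in docker-compose.yml).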
