Celery Django task ModuleNotFoundError



I'm running into a strange ModuleNotFoundError.

I have a docker-compose network in which Celery tasks are shared between containers via a shared volume. The volume looks like this:

celery_use
│   celery_config.py
│   tasks.py
│   __init__.py
#  celery_config.py
import os
from celery import Celery
app = Celery(include=('celery_use.tasks',))
#  tasks.py
from celery.utils.log import get_task_logger
from .celery_config import app
import os
logger = get_task_logger(__name__)

@app.task(bind=True, name='process_availability', queue='availability')
def process_availability(self, message):
    print(os.getcwd())
    print(os.listdir())
    from avail.models import AvailabilityConfirmation

My docker-compose file looks like this:

version: '3.3'
services:
  pas-gateway:
    build:
      context: ./
      dockerfile: Dockerfile.pas
    command: bash -c "python appserver.py"
    environment: &env
      - CELERY_BROKER_URL=amqp://guest:guest@rabbitmq:5672
    depends_on:
      - rabbitmq
    ports:
      - 18000:8000
    restart: 'no'
    volumes:
      - ./pas_gateway:/pas_gateway
      - ./celery_use:/pas_gateway/celery_use/
  django:
    build:
      context: ./
      dockerfile: Dockerfile.django
    command: python manage.py runserver 0.0.0.0:8001
    ports:
      - 18001:8001
    environment: *env
    depends_on:
      - rabbitmq
      - postgres
    volumes:
      - ./ops_interface:/code
  django-celery:
    build:
      context: ./
      dockerfile: Dockerfile.django
    command: bash -c "celery worker --app=celery_use.celery_config:app --concurrency=20 --queues=availability --loglevel=INFO"
    environment: *env
    depends_on:
      - rabbitmq
      - postgres
    volumes:
      - ./ops_interface:/code
      - ./celery_use:/code/project/celery_use
  postgres:
    image: postgres
    ports:
      - "15432:5432"
    volumes:
      - postgres-data:/var/lib/postgresql/data
  rabbitmq:
    image: rabbitmq:3.7.8

volumes:
  postgres-data:

Dockerfile.django looks like:

FROM python:3
ENV PYTHONUNBUFFERED 1
RUN mkdir /code
WORKDIR /code
COPY ./requirements.txt /code/
RUN pip install -r requirements.txt
COPY ./ops_interface/ /code/
WORKDIR /code/project/

Including the shared volume, my Django project directory is structured like:

project
│   db.sqlite3
│   manage.py
│   __init__.py
│
├───avail
│   │   admin.py
│   │   apps.py
│   │   models.py
│   │   tests.py
│   │   views.py
│   │   __init__.py
│
├───celery_use
│   │   tests.py
│   │   views.py
│   │   __init__.py
│
└───project
    │   settings.py
    │   urls.py
    │   wsgi.py
    │   __init__.py

I'm able to enqueue a task from my pas-gateway container:

from celery_use.tasks import process_availability
process_availability.s('testmessage').delay()

Keep in mind that my task currently does only three things: 1) prints the CWD, 2) prints the contents of the current directory, 3) attempts an import from the avail subdirectory. This is the output from django-celery:
[2019-06-26 13:27:49,978: WARNING/ForkPoolWorker-16] /code/project
[2019-06-26 13:27:49,982: WARNING/ForkPoolWorker-16] ['avail', 'celery_use', 'db.sqlite3', 'manage.py', 'project', 'test_module.py', '__init__.py']
[2019-06-26 13:27:49,987: ERROR/ForkPoolWorker-16] Task process_availability[b4f312bd-4220-4c98-a874-6eded7a402b5] raised unexpected: ModuleNotFoundError("No module named 'avail'")
Traceback (most recent call last):
  File "/usr/local/lib/python3.7/site-packages/celery/app/trace.py", line 382, in trace_task
    R = retval = fun(*args, **kwargs)
  File "/usr/local/lib/python3.7/site-packages/celery/app/trace.py", line 641, in __protected_call__
    return self.run(*args, **kwargs)
  File "/code/project/celery_use/tasks.py", line 14, in process_availability
    from avail.models import AvailabilityConfirmation
ModuleNotFoundError: No module named 'avail'

I don't understand why there is no module 'avail'.
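Since Python resolves imports against sys.path rather than the working directory, a listing that shows avail in the CWD does not by itself guarantee the package is importable. A small diagnostic along these lines (a sketch; import_diagnostics is not part of the original code) could be dropped into the task body to show whether /code/project is actually on the worker's sys.path:

```python
import os
import sys

def import_diagnostics(module_name="avail"):
    """Report whether a package directory is reachable via sys.path.

    Imports are resolved against sys.path, not os.getcwd(), so this
    checks each sys.path entry for a directory named `module_name`.
    """
    print("cwd:", os.getcwd())
    for entry in sys.path:
        # An empty sys.path entry means "current working directory".
        path = entry or os.getcwd()
        if os.path.isdir(os.path.join(path, module_name)):
            print(f"{module_name} found under sys.path entry: {path!r}")
            break
    else:
        print(f"{module_name} not found on any sys.path entry")
```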

Celery running in Docker sometimes has problems when code is mounted in via volumes. Try removing these volumes:

  - ./ops_interface:/code
  - ./celery_use:/code/project/celery_use

and replacing them with COPY statements in the Dockerfile where needed.

Also make sure 'avail' is listed in INSTALLED_APPS in your Django settings.py.
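For that second point, a minimal sketch of the relevant settings.py fragment (the default contrib apps here are assumptions; only 'avail' comes from the question's project tree):

```python
# settings.py (fragment) -- the worker can only import Django app
# modules such as avail.models once Django is set up and the app
# is registered in INSTALLED_APPS.
INSTALLED_APPS = [
    "django.contrib.admin",
    "django.contrib.auth",
    "django.contrib.contenttypes",
    "django.contrib.sessions",
    "django.contrib.messages",
    "django.contrib.staticfiles",
    "avail",  # the app whose models the Celery task imports
]
```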
