Celery can't start worker processes when using Scrapy DjangoItem



Issue details: https://github.com/celery/celery/issues/3598

I want to run a Scrapy spider with Celery, where the spider yields DjangoItems.

Here is my Celery task:

# coding_task.py
import sys
from celery import Celery
from collector.collector.crawl_agent import crawl
app = Celery('coding.net', backend='redis', broker='redis://localhost:6379/0')
app.config_from_object('celery_config')

@app.task
def period_task():
    crawl()
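
The task name `period_task` suggests it is meant to run on a schedule. As a hedged sketch (the schedule entry name and the 5-minute interval are assumptions, not from the original post), the `celery_config` module loaded above could register it with Celery beat like this:

```python
# Hypothetical celery_config.py fragment (sketch): run period_task
# periodically via Celery beat. The entry name and interval are assumed.
from datetime import timedelta

CELERYBEAT_SCHEDULE = {
    'crawl-coding-net': {
        'task': 'coding_task.period_task',
        'schedule': timedelta(minutes=5),  # assumed interval
    },
}
CELERY_TIMEZONE = 'UTC'
```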

collector.collector.crawl_agent.crawl contains a Scrapy crawler that uses DjangoItem for its items. The item looks like this:

import os
import django
import scrapy

os.environ['DJANGO_SETTINGS_MODULE'] = 'RaPo3.settings'
django.setup()

from scrapy_djangoitem import DjangoItem
from xxx.models import Collection

class CodingItem(DjangoItem):
    django_model = Collection
    amount = scrapy.Field(default=0)
    role = scrapy.Field()
    type = scrapy.Field()
    duration = scrapy.Field()
    detail = scrapy.Field()
    extra = scrapy.Field()
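
For context, the usual way to persist a DjangoItem is to call its `save()` method (which creates and saves the backing Django model instance), typically from an item pipeline. A minimal sketch (the pipeline class name is made up for illustration):

```python
# Sketch of a Scrapy item pipeline that persists each DjangoItem.
# DjangoItem.save() creates and saves the backing Django model row.
class DjangoWriterPipeline(object):
    def process_item(self, item, spider):
        item.save()  # writes the item to the Collection model
        return item
```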

When I run celery -A coding_task worker --loglevel=info --concurrency=1, the worker fails with the errors below:

[2016-11-16 17:33:41,934: ERROR/Worker-1] Process Worker-1
Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/billiard/process.py", line 292, in _bootstrap
    self.run()
  File "/usr/local/lib/python2.7/site-packages/billiard/pool.py", line 292, in run
    self.after_fork()
  File "/usr/local/lib/python2.7/site-packages/billiard/pool.py", line 395, in after_fork
    self.initializer(*self.initargs)
  File "/usr/local/lib/python2.7/site-packages/celery/concurrency/prefork.py", line 80, in process_initializer
    signals.worker_process_init.send(sender=None)
  File "/usr/local/lib/python2.7/site-packages/celery/utils/dispatch/signal.py", line 151, in send
    response = receiver(signal=self, sender=sender, **named)
  File "/usr/local/lib/python2.7/site-packages/celery/fixups/django.py", line 152, in on_worker_process_init
    self._close_database()
  File "/usr/local/lib/python2.7/site-packages/celery/fixups/django.py", line 181, in _close_database
    funs = [self._db.close_connection]  # pre multidb
AttributeError: 'module' object has no attribute 'close_connection'
[2016-11-16 17:33:41,942: INFO/MainProcess] Connected to redis://localhost:6379/0
[2016-11-16 17:33:41,957: INFO/MainProcess] mingle: searching for neighbors
[2016-11-16 17:33:42,962: INFO/MainProcess] mingle: all alone
/usr/local/lib/python2.7/site-packages/celery/fixups/django.py:199: UserWarning: Using settings.DEBUG leads to a memory leak, never use this setting in production environments!
  warnings.warn('Using settings.DEBUG leads to a memory leak, never '
[2016-11-16 17:33:42,968: WARNING/MainProcess] /usr/local/lib/python2.7/site-packages/celery/fixups/django.py:199: UserWarning: Using settings.DEBUG leads to a memory leak, never use this setting in production environments!
  warnings.warn('Using settings.DEBUG leads to a memory leak, never '
[2016-11-16 17:33:42,968: WARNING/MainProcess] celery@MacBook-Pro.local ready.
[2016-11-16 17:33:42,969: ERROR/MainProcess] Process 'Worker-1' pid:2777 exited with 'exitcode 1'
[2016-11-16 17:33:42,991: ERROR/MainProcess] Unrecoverable error: WorkerLostError('Could not start worker processes',)
Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/celery/worker/__init__.py", line 208, in start
    self.blueprint.start(self)
  File "/usr/local/lib/python2.7/site-packages/celery/bootsteps.py", line 127, in start
    step.start(parent)
  File "/usr/local/lib/python2.7/site-packages/celery/bootsteps.py", line 378, in start
    return self.obj.start()
  File "/usr/local/lib/python2.7/site-packages/celery/worker/consumer.py", line 271, in start
    blueprint.start(self)
  File "/usr/local/lib/python2.7/site-packages/celery/bootsteps.py", line 127, in start
    step.start(parent)
  File "/usr/local/lib/python2.7/site-packages/celery/worker/consumer.py", line 766, in start
    c.loop(*c.loop_args())
  File "/usr/local/lib/python2.7/site-packages/celery/worker/loops.py", line 50, in asynloop
    raise WorkerLostError('Could not start worker processes')
WorkerLostError: Could not start worker processes

If I remove DjangoItem from the item:

import scrapy
from scrapy.item import Item

class CodingItem(Item):
    amount = scrapy.Field(default=0)
    role = scrapy.Field()
    type = scrapy.Field()
    duration = scrapy.Field()
    detail = scrapy.Field()
    extra = scrapy.Field()

the task runs fine without any errors. What should I do if I want to use DjangoItem in this Celery task?

Thanks!

You should check your RAM usage. Celery may not be getting enough memory.

Upgrading Celery to 4.0 will solve the problem.
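
For background: the traceback shows Celery 3.x's Django fixup calling `django.db.close_connection`, a module-level helper that Django deprecated in 1.6 and removed in 1.8 in favor of `django.db.connections.close_all()`; Celery 4's fixup accounts for the newer API. A hedged sketch of that version-tolerant logic (the function name is illustrative, not Celery's actual code):

```python
# Illustrative compatibility helper (not Celery's actual code): close DB
# connections whether or not django.db.close_connection still exists.
def close_database_connections(db_module):
    if hasattr(db_module, 'close_connection'):   # Django < 1.8
        db_module.close_connection()
    else:                                        # Django >= 1.8
        db_module.connections.close_all()
```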

More details: https://github.com/celery/celery/issues/3598
