I have a script called algorithm.py, and I would like to be able to call Scrapy spiders from within it. The file structure is:

algorithm.py
MySpiders/

where MySpiders is a folder containing several Scrapy projects. I want to create methods perform_spider1(), perform_spider2(), ... that I can call from algorithm.py.

How do I construct these methods?

I have managed to launch a single spider using the code below. However, it is not a method, and it only works for one spider. I am a beginner in need of help!
import sys, os.path
sys.path.append('path to spider1/spider1')

from twisted.internet import reactor
from scrapy.crawler import Crawler
from scrapy.settings import Settings
from scrapy import log, signals
from scrapy.xlib.pydispatch import dispatcher
from spider1.spiders.spider1_spider import Spider1Spider

def stop_reactor():
    reactor.stop()

dispatcher.connect(stop_reactor, signal=signals.spider_closed)

spider = Spider1Spider()
crawler = Crawler(Settings())
crawler.configure()
crawler.crawl(spider)
crawler.start()

log.start()
log.msg('Running reactor...')
reactor.run()  # the script will block here
log.msg('Reactor stopped.')
Set up each of your spiders by calling configure, crawl and start, and only then call log.start() and reactor.run(). Scrapy will run multiple spiders in the same process.

For more information, see the documentation and this thread.

Also, consider running your spiders via scrapyd.

Hope that helps.
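The key idea in the answer is that each perform_spiderN() should only schedule a crawl; a single blocking reactor.run() then drives all of them. The structure can be sketched without Scrapy or Twisted at all; ToyReactor and the perform_spiderN functions below are hypothetical stand-ins for illustration only:

```python
# Toy stand-in for the Twisted reactor pattern: each perform_spiderN()
# only *schedules* work, and one run() call executes everything.

class ToyReactor:
    def __init__(self):
        self.tasks = []
        self.running = False

    def schedule(self, fn):
        # analogous to crawler.crawl(...) / crawler.start()
        self.tasks.append(fn)

    def run(self):
        # analogous to reactor.run(): blocks until all work is done
        self.running = True
        while self.tasks and self.running:
            task = self.tasks.pop(0)
            task()

    def stop(self):
        self.running = False

reactor = ToyReactor()
results = []

def perform_spider1():
    reactor.schedule(lambda: results.append("spider1 done"))

def perform_spider2():
    reactor.schedule(lambda: results.append("spider2 done"))

perform_spider1()
perform_spider2()
reactor.run()  # both scheduled "crawls" execute here
```

The point of the sketch: calling perform_spider1() and then perform_spider2() before run() works, but calling run() between them would not, because it blocks until stopped, which is exactly why the solution below schedules both crawlers first and calls reactor.run() once.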
Based on alecxe's advice, here is one possible solution.
import sys, os.path
sys.path.append('/path/ra_list/')
sys.path.append('/path/ra_event/')

from twisted.internet import reactor
from scrapy.crawler import Crawler
from scrapy.settings import Settings
from scrapy import log, signals
from scrapy.xlib.pydispatch import dispatcher
from ra_list.spiders.ra_list_spider import RaListSpider
from ra_event.spiders.ra_event_spider import RaEventSpider

spider_count = 0
number_of_spiders = 2

def stop_reactor_after_all_spiders():
    global spider_count
    spider_count = spider_count + 1
    if spider_count == number_of_spiders:
        reactor.stop()

dispatcher.connect(stop_reactor_after_all_spiders, signal=signals.spider_closed)

def crawl_resident_advisor():
    global spider_count
    spider_count = 0

    crawler = Crawler(Settings())
    crawler.configure()
    crawler.crawl(RaListSpider())
    crawler.start()

    crawler = Crawler(Settings())
    crawler.configure()
    crawler.crawl(RaEventSpider())
    crawler.start()

    log.start()
    log.msg('Running in reactor...')
    reactor.run()  # the script will block here
    log.msg('Reactor stopped.')
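The counting logic above can be exercised in isolation: each spider_closed signal increments a counter, and the reactor is stopped only when every spider has finished. Here is a plain-Python sketch where Dispatcher and the stopped list are stand-ins for scrapy.xlib.pydispatch and reactor.stop():

```python
# Minimal signal dispatcher mimicking pydispatch's connect/send API
# (names are illustrative, not Scrapy's real implementation).

class Dispatcher:
    def __init__(self):
        self.handlers = {}

    def connect(self, handler, signal):
        self.handlers.setdefault(signal, []).append(handler)

    def send(self, signal):
        for handler in self.handlers.get(signal, []):
            handler()

SPIDER_CLOSED = "spider_closed"
dispatcher = Dispatcher()

stopped = []          # stands in for the reactor's stopped state
spider_count = 0
number_of_spiders = 2

def stop_reactor_after_all_spiders():
    global spider_count
    spider_count += 1
    if spider_count == number_of_spiders:
        stopped.append(True)  # stands in for reactor.stop()

dispatcher.connect(stop_reactor_after_all_spiders, signal=SPIDER_CLOSED)

dispatcher.send(SPIDER_CLOSED)  # first spider closes: reactor keeps running
dispatcher.send(SPIDER_CLOSED)  # second spider closes: reactor stops
```

If the reactor were stopped after the first spider_closed signal instead, the second crawl would be cut off mid-run, which is the bug this counter avoids.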