How to catch the sporadic exceptions raised while saving items to a MySQL database



I run a spider in Scrapy every 24 hours. The items the spider scrapes are stored in a MySQL database. To collect only unique items, I set up the table structure to reject duplicate records (a schema sketch follows the traceback below), so duplicate-entry errors come up on most runs. I need to catch all of them and keep them from being printed to the console/terminal. Below is a snapshot of the error.

2020-08-27 07:02:39 [scrapy.core.scraper] ERROR: Error processing {'jobtitle': ['E-Learning Specialist'],
 'joburl': ['https://******/e-learning-specialist-1530588']}
Traceback (most recent call last):
  File "/usr/lib/python3/dist-packages/twisted/internet/defer.py", line 654, in _runCallbacks
    current.result = callback(current.result, *args, **kw)
  File "/home/scrapym0/scrapy/BotV0.1/test2/test2/pipelines.py", line 85, in process_item
    self.store_db(item)
  File "/home/scrapym0/scrapy/BotV0.1/test2/test2/pipelines.py", line 100, in store_db
    self.curr.execute("""INSERT INTO JobList(Job_Title,Job_URL,entry_date) VALUES(%s, %s, %s)""", (
  File "/usr/lib/python3/dist-packages/mysql/connector/cursor.py", line 569, in execute
    self._handle_result(self._connection.cmd_query(stmt))
  File "/usr/lib/python3/dist-packages/mysql/connector/connection.py", line 553, in cmd_query
    result = self._handle_result(self._send_cmd(ServerCmd.QUERY, query))
  File "/usr/lib/python3/dist-packages/mysql/connector/connection.py", line 442, in _handle_result
    raise errors.get_exception(packet)
mysql.connector.errors.IntegrityError: 1062 (23000): Duplicate entry 'https://******/e-learning-specialist-1530588' for key 'JobList.Job_URL_UNIQUE'
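
For context, error 1062 is raised by a UNIQUE index on the Job_URL column. Here is a minimal sketch of such a table definition; only the table name, column names, and the index name Job_URL_UNIQUE come from the traceback, while the connection parameters, the id column, and the column types and sizes are assumptions.

import mysql.connector

# Assumed connection parameters; substitute your own.
conn = mysql.connector.connect(
    host="localhost", user="scrapy_user", password="secret", database="jobs"
)
curr = conn.cursor()

# Column names and the index name are taken from the traceback;
# the id column, types, and sizes are assumptions.
curr.execute("""
    CREATE TABLE IF NOT EXISTS JobList (
        id INT AUTO_INCREMENT PRIMARY KEY,
        Job_Title VARCHAR(255),
        Job_URL VARCHAR(512),
        entry_date DATE,
        UNIQUE KEY Job_URL_UNIQUE (Job_URL)
    )
""")
conn.commit()

Any INSERT that repeats an existing Job_URL value then fails with the IntegrityError shown above.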

A try/except block around the execute() call works as expected:

# mysql.connector must be imported at module level in pipelines.py.
try:
    self.curr.execute("SQL_statement")  # placeholder for the INSERT above
except mysql.connector.errors.IntegrityError:
    # Swallow duplicate-entry (1062) errors so nothing reaches the console.
    pass
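
For a fuller picture, here is a minimal sketch of the whole pipeline with the duplicate handling inside store_db. Only the table and column names and the method names process_item and store_db come from the traceback; the connection parameters, the item field handling, and the use of today's date for entry_date are assumptions.

import datetime

import mysql.connector
from mysql.connector import errors


class MySQLStorePipeline:
    """Sketch of a Scrapy item pipeline that silently skips duplicate rows."""

    def open_spider(self, spider):
        # Assumed connection parameters; substitute your own.
        self.conn = mysql.connector.connect(
            host="localhost", user="scrapy_user", password="secret", database="jobs"
        )
        self.curr = self.conn.cursor()

    def close_spider(self, spider):
        self.curr.close()
        self.conn.close()

    def process_item(self, item, spider):
        self.store_db(item)
        return item

    def store_db(self, item):
        try:
            self.curr.execute(
                "INSERT INTO JobList(Job_Title, Job_URL, entry_date) "
                "VALUES (%s, %s, %s)",
                # Fields arrive as one-element lists (see the log line above);
                # using today's date for entry_date is an assumption.
                (item["jobtitle"][0], item["joburl"][0], datetime.date.today()),
            )
            self.conn.commit()
        except errors.IntegrityError:
            # Duplicate Job_URL: the row was stored on a previous run; skip it
            # quietly so nothing is printed to the console.
            pass

Because store_db no longer raises, process_item returns the item normally, so scrapy.core.scraper never logs the "Error processing" traceback shown above. If you still want a record of skipped duplicates, log them at DEBUG level inside the except block instead of passing silently.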
