Scaling a Heroku dyno from a script



I'm running a spider on a worker dyno and everything works. That's great, but whenever the spider finishes, Heroku gets confused: it prints the following logs and starts the app again, because it thinks the app has crashed.

app[worker.1]: 2020-11-23 00:04:10 [scrapy.core.engine] INFO: Spider closed (finished)
heroku[worker.1]: Process exited with status 0
heroku[worker.1]: State changed from up to crashed
heroku[worker.1]: State changed from crashed to starting
heroku[worker.1]: Starting process with command `python realscraper.py`

So how do I tell the Heroku dyno that when the spider closes, it should scale down to 0? I don't want it to crash and keep trying to restart.

By adapting this tutorial (which contains a few errors and some deprecated bits), I was able to create a scale function inside my pipeline's close_spider method. When my spider closes, it automatically scales the dyno down to zero, so Heroku can't run the script again afterwards.

APP = "herokuappname" ## your app name
KEY = "yourapikey" ## your Heroku API key
PROCESS = "worker" ## any type of dyno process should work here
HEADERS = {
'Accept': "application/vnd.heroku+json; version=3",
'Authorization': 'Bearer ' + KEY,
'Content-Type': 'application/json'
}
class MyPipeline:
## up here are the __init__ and process_item definitions, omitted for clarity
def close_spider(self, spider):
def scale(size):
payload = {'quantity': size} ## size is the number of dynos to scale to
json_payload = json.dumps(payload)
url = "https://api.heroku.com/apps/" + APP + "/formation/" + PROCESS
try:
result = requests.patch(url, headers=HEADERS, data=json_payload)
except:
print("Running scale function didn't work lol")
return None
if result.status_code == 200:
return "Success!"
else:
return "Failure"
print('Scaling out ... Closing app...')
print(scale(0))

return
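
To double-check that the call actually took effect, you can read the same formation endpoint back and inspect its quantity field. This is only a minimal sketch reusing the APP, PROCESS, and HEADERS values defined above; consult Heroku's Platform API reference for the exact response shape.

def current_quantity():
    ## Read the formation back to confirm how many dynos are currently requested.
    url = "https://api.heroku.com/apps/" + APP + "/formation/" + PROCESS
    result = requests.get(url, headers=HEADERS)
    result.raise_for_status()
    return result.json()["quantity"]  ## 0 means the worker is fully scaled down

print(current_quantity())  ## should print 0 after scale(0) succeeds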

Apparently I'm not the only one looking for a solution like this. The Heroku documentation doesn't make it clear enough that you can use the API to shut down dynos from a script.
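
For what it's worth, nothing here is specific to a Scrapy pipeline: the same PATCH request works from any standalone script. Below is a minimal, hypothetical helper under the same assumptions as above (the app name, API key, and process type are placeholders, and the scale_dyno.py name is just illustrative).

## scale_dyno.py -- run `python scale_dyno.py 0` to scale the worker down,
## or `python scale_dyno.py 1` to bring it back up.
import json
import sys

import requests

APP = "herokuappname"  ## your app name
KEY = "yourapikey"     ## your Heroku API key
PROCESS = "worker"

def scale(size):
    ## PATCH the formation resource with the desired dyno quantity.
    url = "https://api.heroku.com/apps/" + APP + "/formation/" + PROCESS
    headers = {
        'Accept': "application/vnd.heroku+json; version=3",
        'Authorization': 'Bearer ' + KEY,
        'Content-Type': 'application/json',
    }
    result = requests.patch(url, headers=headers, data=json.dumps({'quantity': size}))
    result.raise_for_status()  ## raise if Heroku rejected the request
    return result.json()

if __name__ == "__main__":
    scale(int(sys.argv[1]))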
