How to raise Google App Engine's hidden "file.Open" quota
Specifically, this question is about how to raise or remove the quota in question, not about how to work more efficiently within the existing limit.

While running MapReduce jobs on GAE I hit the quota limit listed below: 100 GB per day of "File Bytes Received", which as far as I can tell is the number of file bytes received from the Blobstore. Increasing the budget has no effect on the 100 GB/day cap. I would like the limit removed entirely, with the ability to pay for what I use.
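
For context, the jobs are wired up through the bundled mapreduce library's MapreducePipeline with a Blobstore input reader. A minimal sketch of that shape (the word-count handler names, blob key and shard count below are placeholders, not my actual job) is enough to drive this quota, since every shard streams its input bytes out of the Blobstore:

from mapreduce import mapreduce_pipeline

def start_job(blob_key):
    # Placeholder job: any pipeline of this shape pulls its input through
    # the Files API and counts against "File Bytes Received".
    pipeline = mapreduce_pipeline.MapreducePipeline(
        "word_count",                                        # job name in the mapreduce UI
        "main.word_count_map",                               # mapper, called per input line
        "main.word_count_reduce",                            # reducer, called per key
        "mapreduce.input_readers.BlobstoreLineInputReader",  # streams the blob line by line
        "mapreduce.output_writers.BlobstoreOutputWriter",    # writes results back out
        mapper_params={"blob_keys": blob_key},
        reducer_params={"mime_type": "text/plain"},
        shards=16)
    pipeline.start()
    return pipeline.pipeline_id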

Output in the logs:

The API call file.Open() required more quota than is available.
Traceback (most recent call last):
  File "/base/python27_runtime/python27_lib/versions/third_party/webapp2-2.3/webapp2.py", line 1511, in __call__
    rv = self.handle_exception(request, response, e)
  File "/base/python27_runtime/python27_lib/versions/third_party/webapp2-2.3/webapp2.py", line 1505, in __call__
    rv = self.router.dispatch(request, response)
  File "/base/python27_runtime/python27_lib/versions/third_party/webapp2-2.3/webapp2.py", line 1253, in default_dispatcher
    return route.handler_adapter(request, response)
  File "/base/python27_runtime/python27_lib/versions/third_party/webapp2-2.3/webapp2.py", line 1077, in __call__
    return handler.dispatch()
  File "/base/python27_runtime/python27_lib/versions/third_party/webapp2-2.3/webapp2.py", line 547, in dispatch
    return self.handle_exception(e, self.app.debug)
  File "/base/python27_runtime/python27_lib/versions/third_party/webapp2-2.3/webapp2.py", line 545, in dispatch
    return method(*args, **kwargs)
  File "/base/data/home/apps/s~utest-appgraph/69.358421800203055451/mapreduce/base_handler.py", line 68, in post
    self.handle()
  File "/base/data/home/apps/s~utest-appgraph/69.358421800203055451/mapreduce/handlers.py", line 168, in handle
    for entity in input_reader:
  File "/base/data/home/apps/s~utest-appgraph/69.358421800203055451/mapreduce/mapreduce_pipeline.py", line 109, in __iter__
    for binary_record in super(_ReducerReader, self).__iter__():
  File "/base/data/home/apps/s~utest-appgraph/69.358421800203055451/mapreduce/input_readers.py", line 1615, in __iter__
    record = self._reader.read()
  File "/base/data/home/apps/s~utest-appgraph/69.358421800203055451/mapreduce/lib/files/records.py", line 335, in read
    (chunk, record_type) = self.__try_read_record()
  File "/base/data/home/apps/s~utest-appgraph/69.358421800203055451/mapreduce/lib/files/records.py", line 292, in __try_read_record
    header = self.__reader.read(HEADER_LENGTH)
  File "/base/data/home/apps/s~utest-appgraph/69.358421800203055451/mapreduce/lib/files/file.py", line 569, in read
    with open(self._filename, 'r') as f:
  File "/base/data/home/apps/s~utest-appgraph/69.358421800203055451/mapreduce/lib/files/file.py", line 436, in open
    exclusive_lock=exclusive_lock)
  File "/base/data/home/apps/s~utest-appgraph/69.358421800203055451/mapreduce/lib/files/file.py", line 269, in __init__
    self._open()
  File "/base/data/home/apps/s~utest-appgraph/69.358421800203055451/mapreduce/lib/files/file.py", line 393, in _open
    self._make_rpc_call_with_retry('Open', request, response)
  File "/base/data/home/apps/s~utest-appgraph/69.358421800203055451/mapreduce/lib/files/file.py", line 397, in _make_rpc_call_with_retry
    _make_call(method, request, response)
  File "/base/data/home/apps/s~utest-appgraph/69.358421800203055451/mapreduce/lib/files/file.py", line 243, in _make_call
    rpc.check_success()
  File "/base/python27_runtime/python27_lib/versions/1/google/appengine/api/apiproxy_stub_map.py", line 558, in check_success
    self.__rpc.CheckSuccess()
  File "/base/python27_runtime/python27_lib/versions/1/google/appengine/api/apiproxy_rpc.py", line 133, in CheckSuccess
    raise self.exception
OverQuotaError: The API call file.Open() required more quota than is available.

It looks like you need to talk to Google directly: on the quota page there is a link to a form for requesting a quota increase: http://support.google.com/code/bin/request.py?&contact_type=AppEngineCPURequest

I ran into this error too. We are using App Engine's experimental backup feature, which in turn runs a mapreduce that backs up all of the App Engine data to Google Cloud Storage. The current backup, however, fails with this error:

OverQuotaError: The API call file.Open() required more quota than is available.
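
The backup writes through the same Files API, just in the other direction: as far as I can tell, bytes pushed toward Cloud Storage are what the "File Bytes Sent" counter measures. A rough sketch of that write path under the then-current API (the bucket and object names here are invented):

from google.appengine.api import files

def write_backup_chunk(data):
    # Invented bucket/object names; every byte written through this path
    # appears to be charged against the hidden "File Bytes Sent" quota.
    filename = files.gs.create('/gs/my-backup-bucket/chunk-0001',
                               mime_type='application/octet-stream')
    with files.open(filename, 'a') as f:
        f.write(data)
    files.finalize(filename)  # the object only appears in GCS once finalized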

In the quota dashboard we see:

Other Quotas With Warnings
These quotas are only shown when they have warnings
File Bytes Sent 100%    107,374,182,400 of 107,374,182,400  Limited

So we evidently hit a hidden "File Bytes Sent" quota (107,374,182,400 bytes = 100 GiB). It is not documented anywhere, so we could never have known we would run into it... and now we are stuck.
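
For jobs whose code you control (unlike the built-in backup), the error can at least be trapped so a shard logs a clear message instead of dying with the raw traceback above. A defensive sketch; read_records here is a hypothetical wrapper, not part of the mapreduce library:

import logging
from google.appengine.api import files
from google.appengine.runtime import apiproxy_errors

def read_records(filename):
    try:
        with files.open(filename, 'r') as f:
            return f.read()
    except apiproxy_errors.OverQuotaError:
        # Daily quotas reset at midnight Pacific time; fail loudly so the
        # job can be resumed afterwards instead of burning shard retries.
        logging.error("file.Open() over quota while reading %s", filename)
        raise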
