tensorflow_model_server does not work in Colab



While following the tutorial here, I ran

tensorflow_model_server \
  --rest_api_port=8501 \
  --model_name=tfrbert \
  --model_base_path="/content/drive/MyDrive/app/model1/export/latest_model/"

in Colab, but the cell just keeps running, and the output is:

2021-08-06 14:08:56.175079: I tensorflow_serving/model_servers/server.cc:89] Building single TensorFlow model file config:  model_name: tfrbert model_base_path: /content/drive/MyDrive/app/model1/export/latest_model/
2021-08-06 14:08:56.175256: I tensorflow_serving/model_servers/server_core.cc:465] Adding/updating models.
2021-08-06 14:08:56.175287: I tensorflow_serving/model_servers/server_core.cc:591]  (Re-)adding model: tfrbert
2021-08-06 14:08:56.282059: I tensorflow_serving/core/basic_manager.cc:740] Successfully reserved resources to load servable {name: tfrbert version: 1628175000}
2021-08-06 14:08:56.282118: I tensorflow_serving/core/loader_harness.cc:66] Approving load for servable version {name: tfrbert version: 1628175000}
2021-08-06 14:08:56.282136: I tensorflow_serving/core/loader_harness.cc:74] Loading servable version {name: tfrbert version: 1628175000}
2021-08-06 14:08:56.282741: I external/org_tensorflow/tensorflow/cc/saved_model/reader.cc:38] Reading SavedModel from: /content/drive/MyDrive/app/model1/export/latest_model/1628175000
2021-08-06 14:08:56.307745: I external/org_tensorflow/tensorflow/cc/saved_model/reader.cc:90] Reading meta graph with tags { serve }
2021-08-06 14:08:56.307841: I external/org_tensorflow/tensorflow/cc/saved_model/reader.cc:132] Reading SavedModel debug info (if present) from: /content/drive/MyDrive/app/model1/export/latest_model/1628175000
2021-08-06 14:08:56.308630: I external/org_tensorflow/tensorflow/core/platform/cpu_feature_guard.cc:142] This TensorFlow binary is optimized with oneAPI Deep Neural Network Library (oneDNN) to use the following CPU instructions in performance-critical operations:  AVX2 FMA
To enable them in other operations, rebuild TensorFlow with the appropriate compiler flags.
2021-08-06 14:08:56.511612: I external/org_tensorflow/tensorflow/cc/saved_model/loader.cc:206] Restoring SavedModel bundle.
2021-08-06 14:08:56.539804: I external/org_tensorflow/tensorflow/core/platform/profile_utils/cpu_utils.cc:114] CPU Frequency: 2299995000 Hz
2021-08-06 14:08:57.499446: I external/org_tensorflow/tensorflow/cc/saved_model/loader.cc:190] Running initialization op on SavedModel bundle at path: /content/drive/MyDrive/app/model1/export/latest_model/1628175000
2021-08-06 14:08:57.567048: I external/org_tensorflow/tensorflow/cc/saved_model/loader.cc:277] SavedModel load for tags { serve }; Status: success: OK. Took 1284304 microseconds.
2021-08-06 14:08:57.577930: I tensorflow_serving/servables/tensorflow/saved_model_warmup_util.cc:59] No warmup data file found at /content/drive/MyDrive/app/model1/export/latest_model/1628175000/assets.extra/tf_serving_warmup_requests
2021-08-06 14:08:57.582683: I tensorflow_serving/core/loader_harness.cc:87] Successfully loaded servable version {name: tfrbert version: 1628175000}
2021-08-06 14:08:57.583905: I tensorflow_serving/model_servers/server_core.cc:486] Finished adding/updating models
2021-08-06 14:08:57.583970: I tensorflow_serving/model_servers/server.cc:367] Profiler service is enabled
2021-08-06 14:08:57.584549: I tensorflow_serving/model_servers/server.cc:393] Running gRPC ModelServer at 0.0.0.0:8500 ...
[warn] getaddrinfo: address family for nodename not supported
[evhttp_server.cc : 245] NET_LOG: Entering the event loop ...
2021-08-06 14:08:57.585146: I tensorflow_serving/model_servers/server.cc:414] Exporting HTTP/REST API at:localhost:8501 ...

What is the problem?

I looked at the tutorial link you provided. It assumes that tensorflow_model_server is run as a bash shell script. You can still run it in Google Colab, but you need to decorate the command with a few tools so that it behaves like the rest of your regular Python cells.

tensorflow_model_server starts a web server, so it does not exit on its own: it keeps waiting for requests and returning responses, which is why your cell appears to hang. If you want it to run in the background and give control back to the UI so you can run other cells, modify the code like this:

%%bash --bg
nohup tensorflow_model_server \
  --rest_api_port=8501 \
  --model_name=tfrbert \
  --model_base_path="path/to/dir" > server.log 2>&1

What changed:

  • %%bash --bg: a Jupyter magic command that tells Colab the cell is a bash script and should be run in the background
  • nohup: short for "no hang up"; it keeps the command running, immune to hangup signals.
  • > server.log 2>&1: saves the command's output to the server.log file instead of printing it to stdout. (The output you pasted in the question will no longer be visible in the UI, but it will be written to that file.)
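Once that cell returns, you can verify the server really started by tailing the log file and calling TensorFlow Serving's model-status REST endpoint. A minimal sanity check, assuming the port 8501 and model name tfrbert from your question and the server.log file from the snippet above (the 5-second wait is just a rough guess at startup time):

%%bash
# give the server a few seconds to load the SavedModel
sleep 5
tail server.log
# model-status endpoint; a healthy server reports the loaded version with state AVAILABLE
curl http://localhost:8501/v1/models/tfrbert

If curl returns a JSON body with "state": "AVAILABLE", the server is up, and the long-running behaviour you saw was simply the server sitting idle, waiting for requests.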
