Unable to run inference on a TensorFlow model served by the KFServing component in Kubeflow



Hi, I'm using the KFServing v0.5.1 component to serve a model. I can download the model from S3 and deploy it, but I run into problems when trying to access it.

After deployment, KFServing reports the following endpoint:

http://recommendation-model.kubeflow-user-example-com.example.com

I can't access it from inside or outside the node. After looking around, I changed the ingress gateway from NodePort to LoadBalancer and added an sslip.io entry to Knative Serving's config-domain ConfigMap.

I followed the Knative DNS configuration guide:

<IPofMyLB>.sslip.io: ""
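Spelled out as a full manifest, the config-domain change looks like the sketch below (the data key is the sslip.io domain built from the LoadBalancer IP; an empty value applies the domain to all Knative services):

```yaml
apiVersion: v1
kind: ConfigMap
metadata:
  name: config-domain
  namespace: knative-serving
data:
  # Domain built from the LoadBalancer's external IP; the empty value
  # means this domain is used for all services (no label selector).
  <IPofMyLB>.sslip.io: ""
```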

After that I tried to run inference against the model, but got neither an error nor a response from the server:

curl -d '{"instances": ["abc"]}'   -X POST http://recommendation-model.kubeflow-user-example-com.<IPofMyLB>.sslip.io/v1/models/recommendation-model:predict

I then tried a plain GET on the inference endpoint, and the output was:

curl -v -X GET http://recommendation-model.kubeflow-user-example-com.ELBIP.sslip.io
Note: Unnecessary use of -X or --request, GET is already inferred.
*   Trying ELBIP...
* TCP_NODELAY set
* Connected to recommendation-model.kubeflow-user-example-com.ELBIP.sslip.io (ELBIP) port 80 (#0)
> GET / HTTP/1.1
> Host: recommendation-model.kubeflow-user-example-com.ELBIP.sslip.io
> User-Agent: curl/7.64.1
> Accept: */*
>
< HTTP/1.1 302 Found
< content-type: text/html; charset=utf-8
< location: /dex/auth?client_id=kubeflow-oidc-authservice&redirect_uri=%2Flogin%2Foidc&response_type=code&scope=profile+email+groups+openid&state=MTYyMjY0MjU5M3xFd3dBRUV4MVVERkljREpRVUc1SVdXeDFaVkk9fOEnkjCWGNj6WPOgFhv2BUwNSKHsYyBR2kyj9_0geX2f
< date: Wed, 02 Jun 2021 14:03:13 GMT
< content-length: 269
< x-envoy-upstream-service-time: 1
< server: istio-envoy
<
<a href="/dex/auth?client_id=kubeflow-oidc-authservice&amp;redirect_uri=%2Flogin%2Foidc&amp;response_type=code&amp;scope=profile+email+groups+openid&amp;state=MTYyMjY0MjU5M3xFd3dBRUV4MVVERkljREpRVUc1SVdXeDFaVkk9fOEnkjCWGNj6WPOgFhv2BUwNSKHsYyBR2kyj9_0geX2f">Found</a>.
* Connection #0 to host recommendation-model.kubeflow-user-example-com.ELBIP.sslip.io left intact
* Closing connection 0

Model directory structure:

recommendation_model/
└── 1
    ├── assets
    ├── keras_metadata.pb
    ├── saved_model.pb
    └── variables
        ├── variables.data-00000-of-00001
        └── variables.index
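This is the versioned SavedModel layout that TensorFlow Serving (which backs KFServing's TensorFlow predictor) expects: a numeric version directory containing saved_model.pb and variables/. A throwaway local sketch that recreates the layout with empty placeholder files and checks for the key file:

```shell
# Hypothetical local sketch: recreate the expected layout with empty
# placeholder files, then check for the file TF Serving actually loads.
mkdir -p recommendation_model/1/variables recommendation_model/1/assets
touch recommendation_model/1/saved_model.pb
touch recommendation_model/1/variables/variables.index
touch recommendation_model/1/variables/variables.data-00000-of-00001

# The predictor loads the highest numeric version directory that
# contains a saved_model.pb.
[ -f recommendation_model/1/saved_model.pb ] && echo "SavedModel layout OK"
```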

I'm not sure how to get serving to work, and this is the last piece of my pipeline.

When you access the inference service through the LoadBalancer (which is essentially the istio-ingressgateway), your request passes through an additional layer of control imposed by Istio security policies, compared to the NodePort.
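The external address used in the sslip.io domain is the one assigned to the istio-ingressgateway Service. One way to look it up (assuming the istio-system namespace of a default Kubeflow install; adjust if yours differs):

```shell
# Prints the external IP of the ingress gateway LoadBalancer;
# some cloud providers expose a hostname instead of an IP.
kubectl get svc istio-ingressgateway -n istio-system \
  -o jsonpath='{.status.loadBalancer.ingress[0].ip}'
```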

The response curl prints indicates that you have Istio installed with Dex authentication: the 302 redirect to /dex/auth means the auth service intercepted the request before it reached the predictor.

The istio-dex guide provides an example of how to set a cookie to authenticate inference requests.
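Once you have a Dex session (log in through the browser, then copy the session cookie from your browser's dev tools), you can attach it to the request. A sketch, assuming the `authservice_session` cookie name that oidc-authservice sets in default Kubeflow installs (verify the name in your own browser):

```shell
# SESSION holds the authservice_session cookie value copied from a
# browser session that has already logged in through Dex.
SESSION="<paste cookie value here>"

curl -d '{"instances": ["abc"]}' \
  -H "Cookie: authservice_session=${SESSION}" \
  -X POST http://recommendation-model.kubeflow-user-example-com.<IPofMyLB>.sslip.io/v1/models/recommendation-model:predict
```

With a valid cookie the request should reach the predictor instead of being 302-redirected to /dex/auth.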
