1. Build Docker Image
Because building this image yourself often runs into problems, a mirror is temporarily hosted on Docker Hub: docker.io/mochin/tensorflow-serving
Push this image to the Docker registry of the K8s cluster.
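The pull-retag-push flow can be sketched as below. The destination registry (registry.lenovo.com:8080, taken from the YAML in the next step) and the tag are assumptions; adapt them to your cluster:

```shell
# Names are assumptions: SRC is the Docker Hub mirror above,
# DST targets the cluster's private registry used in the YAML below.
SRC="docker.io/mochin/tensorflow-serving"
DST="registry.lenovo.com:8080/tensorflow-serving:0.4.0"

docker pull "$SRC"         # fetch the mirrored build
docker tag  "$SRC" "$DST"  # retag for the private registry
docker push "$DST"         # make it pullable by the K8s nodes
```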
2. Write the YAML
The official example provides a YAML file, but parts of it are wrong, or the Docker image does not apply (probably because of version 0.4.0), so I made some changes.
apiVersion: extensions/v1beta1
kind: Deployment
metadata:
  name: inception-deployment
spec:
  replicas: 2
  template:
    metadata:
      labels:
        app: inception-server
    spec:
      containers:
      - name: inception-container
        image: registry.lenovo.com:8080/tensorflow-serving:0.4.0
        command:
        - /bin/sh
        - -c
        args:
        - serving/bazel-bin/tensorflow_serving/example/inception_inference --port=9900 serving/inception-export
        ports:
        - containerPort: 9900
---
apiVersion: v1
kind: Service
metadata:
  labels:
    run: inception-service
  name: inception-service
spec:
  ports:
  - port: 9900
    targetPort: 9900
  selector:
    app: inception-server
  #type: LoadBalancer
  externalIPs:
  - xxx.xxx.xxx.xxx
3. Submit to K8s
kubectl create -f xxx.yaml
4. View status
#> kubectl get service
NAME                CLUSTER-IP        EXTERNAL-IP       PORT(S)    AGE
inception-service   xxx.xxx.xxx.xxx   xxx.xxx.xxx.xxx   9900/TCP   8m
#> kubectl get deployment
NAME                   DESIRED   CURRENT   UP-TO-DATE   AVAILABLE   AGE
inception-deployment   2         2         2            2           3h
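If the deployment does not become available as shown above, the pod level gives more detail. A hedged sketch (the label `app=inception-server` comes from the Deployment YAML; the pod name is a placeholder, substitute one from the `get pods` output):

```shell
# List the pods behind the deployment, selected by the label from the YAML
kubectl get pods -l app=inception-server
# Inspect a failing pod; "inception-deployment-xxxx" is a placeholder name
kubectl describe pod inception-deployment-xxxx
kubectl logs inception-deployment-xxxx
```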
5. Run a test
Run a Docker container to access the deployed service:
docker run --rm -it tensorflow-serving:latest /bin/bash
#> cd serving
#> ./bazel-bin/tensorflow_serving/example/inception_client --server=10.100.208.54:9900 --image=./tensorflow/tensorflow/examples/label_image/data/grace_hopper.jpg &> log &
7.863746:military uniform
6.911478:bow tie, bow-tie, bowtie
6.642433:mortarboard
5.758826:suit, suit of clothes
5.614463:academic gown, academic robe, judge's robe
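The client writes its results to the log as `<score>:<label>` lines, as shown above. A small sketch for pulling out the top-scoring label from that output (here the sample lines are inlined; in practice you would pipe in the `log` file from your own run):

```shell
# Extract the highest-scoring label from "<score>:<label>" lines.
# Sample data reproduces the output above.
log_lines='7.863746:military uniform
6.911478:bow tie, bow-tie, bowtie
6.642433:mortarboard'
# Sort numerically by the score field (descending), keep the label of the top line.
top=$(printf '%s\n' "$log_lines" | sort -t: -k1,1 -rn | head -n1 | cut -d: -f2-)
echo "$top"   # -> military uniform
```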
6. Load Balancing Configuration
TODO
Reference: TensorFlow Serving with Kubernetes