
tensorflow-object-detection-app

Public repository · 8 stars · 2 forks · 0 issues

Commits

Commits on branch master (all unverified, committed 7 years ago by iitamaro):

- ebafed5a9b5ec6705d5d87b3b47bf12901adf544: Add K8s GPU deployment instructions to README
- 47a31058f2530e9892ac79afd5d7fcae6b7b659a: Add K8s deployment YAML for GPU variant
- 5d35c8ba436c3ea68676af555768ee0680e57b51: Add K8s deployment instructions to README
- 5b1bec24f11ba16145c8a4ba65d0bce56c1ec2ea: Add K8s deployment YAML
- 569c61c277c3bf4fa762c41f4c0ad9a26ed6210d: Add prebuilt images to README
- e16a6050b3164f9e09fcd5825b6176af7e2f7f33: Another note about running in README

README

TensorFlow Object Detection web app

Based on a demo app by GoogleCloudPlatform, this project uses TensorFlow with pre-trained models to implement a dockerized, general-purpose object detection service.

See also the Google Solution.

Build & Run

The default version (non-GPU):

docker build -t object-detection-app:latest .
docker run --rm -p 8000:8000 object-detection-app:latest

The GPU version (requires NVIDIA-Docker):

docker build -t object-detection-app:gpu .
docker run --runtime=nvidia --rm -p 8000:8000 object-detection-app:gpu

Once the container is up and running, access the app at localhost:8000 (replace localhost with the Docker Machine IP if you are using Docker Machine).

Wait for startup output similar to the following lines:

2017-12-18 18:04:07.558019: I tensorflow/core/platform/cpu_feature_guard.cc:137] Your CPU supports instructions that this TensorFlow binary was not compiled to use: SSE4.1 SSE4.2 AVX AVX2 FMA
 * Running on http://0.0.0.0:8000/ (Press CTRL+C to quit)
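Once those lines appear, you can sanity-check the service from another terminal. This is a minimal probe, assuming the app answers plain HTTP GETs on the root path:

```shell
# probe the app's HTTP endpoint (root path is an assumption; adjust if the app serves elsewhere)
APP_URL="http://localhost:8000"
curl -fsS -o /dev/null "$APP_URL" && echo "app is up" || echo "app not reachable yet"
```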

Running Pre-built Images

To run pre-built images from Docker Hub:

docker run --rm -p 8000:8000 itamarost/object-detection-app:1.0-py3
# or, using nvidia-docker
docker run --runtime=nvidia --rm -p 8000:8000 itamarost/object-detection-app:1.0-py3-gpu

Deploy to Kubernetes

To run the app on Kubernetes (assuming kubectl is configured against your cluster):

kubectl apply -f k8s-deploy.yaml
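After applying the manifest, you can watch the rollout. The deployment name and label below are assumptions, not taken from the repository's manifest, so match them to whatever k8s-deploy.yaml actually declares:

```shell
# watch the pods come up (deployment name and app label are assumed; adjust to your manifest)
APP_NAME="object-detection-app"
kubectl get pods -l "app=$APP_NAME" || echo "cluster not reachable"
kubectl rollout status "deployment/$APP_NAME" --timeout=60s || echo "rollout not finished"
```

Once the rollout completes, `kubectl port-forward deployment/$APP_NAME 8000:8000` exposes the app on localhost:8000 without going through a fronting service.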

To utilize a GPU, on Kubernetes clusters with available NVIDIA GPUs (GPU scheduling is alpha at the moment and may break due to Kubernetes API changes):

kubectl apply -f k8s-deploy-gpu.yaml

Feel free to tailor the YAML to your needs (deployed image, fronting service type, namespace, etc.).
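As a starting point for such tailoring, here is a minimal sketch of a deployment manifest of the same general shape. Everything below (names, labels, replica count, service type) is an illustrative assumption, not the contents of the repository's actual k8s-deploy.yaml; only the image tag comes from the pre-built images above:

```yaml
# illustrative sketch only -- names and labels are assumptions, not the repo's manifest
apiVersion: apps/v1
kind: Deployment
metadata:
  name: object-detection-app
spec:
  replicas: 1
  selector:
    matchLabels:
      app: object-detection-app
  template:
    metadata:
      labels:
        app: object-detection-app
    spec:
      containers:
      - name: app
        image: itamarost/object-detection-app:1.0-py3
        ports:
        - containerPort: 8000
---
apiVersion: v1
kind: Service
metadata:
  name: object-detection-app
spec:
  type: ClusterIP   # change to LoadBalancer or NodePort for external access
  selector:
    app: object-detection-app
  ports:
  - port: 8000
    targetPort: 8000
```

For a GPU variant, the container would additionally request a device via `nvidia.com/gpu` under `resources.limits`.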