HTTP proxy for a TensorFlow image classification model that can be used as a microservice in your application. The classification model you pass to the container should be based on the Google Inception model. For more information, you can read the docs or watch the guide.
docker run -d -p 80:80 -v <graph-path>:/project/graph dimorinny/tensorflow-image-classificator
The graph path should contain the following files (see the example below):
- Protobuf graph file: retrained_graph.pb
- Labels text file: retrained_labels.txt
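For example, assuming the retrained model lives in /home/user/graph on the host (a hypothetical path), the directory and run command would look like this:

```
# Hypothetical host directory with the two required files
$ ls /home/user/graph
retrained_graph.pb  retrained_labels.txt

# Mount it at /project/graph inside the container
$ docker run -d -p 80:80 \
    -v /home/user/graph:/project/graph \
    dimorinny/tensorflow-image-classificator
```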
Environment parameters (an example run follows the list):
- PROCESS_POOL_SIZE - Number of worker processes used for recognition (defaults to the number of CPUs)
- TENSORFLOW_MODEL_PATH - Path to the graph file and the labels text file (defaults to /project/graph)
- PORT - HTTP proxy port (defaults to 80)
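For example, a sketch of a run that overrides these defaults (the values and host path are illustrative):

```
# 4 worker processes, proxy listening on port 8080 inside the container
docker run -d -p 8080:8080 \
    -e PROCESS_POOL_SIZE=4 \
    -e PORT=8080 \
    -v /home/user/graph:/project/graph \
    dimorinny/tensorflow-image-classificator
```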
For image classification, execute a GET request with an image URL parameter, like this:
http://127.0.0.1/api/v1/recognize?image=http://i.imgur.com/yAWdJ9b.jpg
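The same request with curl, assuming the container is published on port 80 of localhost:

```
curl "http://127.0.0.1/api/v1/recognize?image=http://i.imgur.com/yAWdJ9b.jpg"
```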
You can also send the image as a POST request with an image form-data parameter, using the same URL.
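A POST sketch with curl, assuming the image form-data field carries the image file itself (the local path is hypothetical):

```
curl -X POST \
    -F "image=@/path/to/photo.jpg" \
    "http://127.0.0.1/api/v1/recognize"
```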
After that, the server returns a recognition result for every label contained in your graph. For example, a successful request yields a response like this:
{
    "status": "success",
    "response": {
        "bad food": 0.23140761256217957,
        "good food": 0.7685924172401428
    }
}
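For instance, to extract the highest-scoring label from a successful response with jq (assuming jq is installed):

```
curl -s "http://127.0.0.1/api/v1/recognize?image=http://i.imgur.com/yAWdJ9b.jpg" \
    | jq -r '.response | to_entries | max_by(.value) | .key'
# => good food
```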
If an error occurs during the recognition process, you get a response like this:
{
    "status": "error"
}
For more details, look at the container's stderr.
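For example, assuming the container was started in detached mode as above, its output (including stderr) can be inspected with docker logs:

```
# Find the running container by image and dump its logs
docker logs $(docker ps -q --filter ancestor=dimorinny/tensorflow-image-classificator)
```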