This project corresponds to a Kaggle image classification contest organized by DataTalks.Club.
In this competition we need to classify images of different kitchenware items into 6 classes:
- cups
- glasses
- plates
- spoons
- forks
- knives
I experimented with several different ways of deploying this model.
Telegram Bot for Kitchenware Classification Model
docker pull maryorihuela/kitchenware-classification:gdvanld5rg76jrft
docker run -it --rm -p 3000:3000 maryorihuela/kitchenware-classification:gdvanld5rg76jrft serve --production
- Just go to localhost:3000 and test it by uploading an image (or call the endpoint from code, as sketched below).
- Video of the model running locally with BentoML:
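If you prefer to test the BentoML service from code rather than the browser UI, a request along these lines should work. This is only a sketch: the endpoint name (/classify_image) is an assumption and may differ from the actual service definition.

```python
# Hypothetical example of calling the BentoML service directly; the endpoint
# name '/classify_image' is an assumption and may not match the real service.
import requests

with open('0040.jpg', 'rb') as f:  # any local test image
    response = requests.post(
        'http://localhost:3000/classify_image',  # assumed endpoint name
        data=f.read(),
        headers={'Content-Type': 'image/jpeg'},
    )

print(response.text)  # predicted class / probabilities
```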
- Pull the Docker image and run it:
docker pull maryorihuela/kitchenware-model:latest
docker run -it --rm -p 8080:8080 maryorihuela/kitchenware-model:latest
- Download test.py
- Run:
python3 test.py
- To test different images, pass a URL after the file name, e.g.:
python3 test.py https://raw.githubusercontent.com/mary435/kitchenware_classification/main/images/0040.jpg
- I uploaded some images to my repository for testing, but you can use any URL (a sketch of what test.py does follows the list below):
https://raw.githubusercontent.com/mary435/kitchenware_classification/main/images/0017.jpg
https://raw.githubusercontent.com/mary435/kitchenware_classification/main/images/0022.jpg
https://raw.githubusercontent.com/mary435/kitchenware_classification/main/images/0034.jpg
https://raw.githubusercontent.com/mary435/kitchenware_classification/main/images/0036.jpg
https://raw.githubusercontent.com/mary435/kitchenware_classification/main/images/0040.jpg
https://raw.githubusercontent.com/mary435/kitchenware_classification/main/images/6172.jpg
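For reference, here is a rough sketch of what a script like test.py might look like. The invocation path assumes the container wraps the tflite model behind the AWS Lambda runtime interface emulator on port 8080 (see the "lambda model file" conversion below); the actual endpoint in the repo may differ.

```python
# Rough sketch of a test.py-style client; the endpoint path assumes the
# container exposes the AWS Lambda runtime interface emulator on port 8080.
import sys
import requests

# Default test image; pass any other image URL as the first argument.
default_image = 'https://raw.githubusercontent.com/mary435/kitchenware_classification/main/images/0040.jpg'
image_url = sys.argv[1] if len(sys.argv) > 1 else default_image

endpoint = 'http://localhost:8080/2015-03-31/functions/function/invocations'  # assumed path

response = requests.post(endpoint, json={'url': image_url})
print(response.json())  # predicted class / per-class scores
```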
- Download the dataset from the Kaggle contest.
- Or use the Kaggle API:
kaggle competitions download -c kitchenware-classification
The notebook was created with this Anaconda environment: model.yaml
Download it and import it into Anaconda (Environments > Import).
Next, open the Jupyter notebook and run it to see the EDA, the training of the different models, the selection process, and the parameter tuning.
Script to train the final model and save it. To run this script you need the dataset saved in the same folder and one of these environments (a rough sketch of the training script follows the list below):
- Anaconda: model.yaml
python3 train.py
- Pipenv: Pipfile and Pipfile.lock
pipenv run python3 train.py
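For orientation, here is a compressed sketch of what a training script like train.py could look like, assuming Xception transfer learning at 299x299 (which matches the SavedModel signature shown later) and images arranged one folder per class; the actual data loading, augmentation, and hyperparameters live in the notebook and may differ.

```python
# Compressed sketch of an Xception transfer-learning script; the data layout
# (one sub-folder per class) and the hyperparameters are assumptions.
from tensorflow import keras
from tensorflow.keras.applications.xception import Xception, preprocess_input

train_gen = keras.preprocessing.image.ImageDataGenerator(
    preprocessing_function=preprocess_input
)
train_ds = train_gen.flow_from_directory(
    'data/train', target_size=(299, 299), batch_size=32
)

base = Xception(weights='imagenet', include_top=False, input_shape=(299, 299, 3))
base.trainable = False  # freeze the convolutional base

inputs = keras.Input(shape=(299, 299, 3))
x = base(inputs, training=False)
x = keras.layers.GlobalAveragePooling2D()(x)
outputs = keras.layers.Dense(6)(x)  # 6 kitchenware classes (logits)
model = keras.Model(inputs, outputs)

model.compile(
    optimizer=keras.optimizers.Adam(learning_rate=0.001),
    loss=keras.losses.CategoricalCrossentropy(from_logits=True),
    metrics=['accuracy'],
)

model.fit(train_ds, epochs=10)
model.save('kitchenware-model.h5')
```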
- To build this you need kitchenware-model.h5 from train.py. Download the script keras_to_tflite.py and run it to convert 'kitchenware-model.h5' into a Lambda-ready model file, kitchenware-model.tflite (a sketch of the conversion follows below).
Run: python3 keras_to_tflite.py
Or: pipenv run python3 keras_to_tflite.py
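For context, a minimal sketch of a keras_to_tflite.py-style conversion is shown below, using the standard TFLiteConverter workflow; the repo's actual script may add optimizations.

```python
# Minimal sketch of converting the Keras model to TFLite with the standard
# TFLiteConverter workflow (the real keras_to_tflite.py may differ in detail).
import tensorflow as tf
from tensorflow import keras

model = keras.models.load_model('kitchenware-model.h5')

converter = tf.lite.TFLiteConverter.from_keras_model(model)
tflite_model = converter.convert()

with open('kitchenware-model.tflite', 'wb') as f:
    f.write(tflite_model)
```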
- Download these files:
- Run these commands (a sketch of the handler packaged inside this image follows below):
docker build -t kitchenware-model .
docker run -it --rm -p 8080:8080 kitchenware-model:latest
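The image built here typically wraps a small Lambda-style handler around the tflite model. The sketch below shows the general shape only; the file name, class order, and preprocessing choice are assumptions rather than the repo's exact code.

```python
# Hypothetical lambda_function.py-style handler around the tflite model;
# the class order and the preprocessing details are assumptions.
import tflite_runtime.interpreter as tflite
from keras_image_helper import create_preprocessor

preprocessor = create_preprocessor('xception', target_size=(299, 299))

interpreter = tflite.Interpreter(model_path='kitchenware-model.tflite')
interpreter.allocate_tensors()
input_index = interpreter.get_input_details()[0]['index']
output_index = interpreter.get_output_details()[0]['index']

classes = ['cup', 'fork', 'glass', 'knife', 'plate', 'spoon']  # order assumed

def predict(url):
    X = preprocessor.from_url(url)
    interpreter.set_tensor(input_index, X)
    interpreter.invoke()
    preds = interpreter.get_tensor(output_index)
    return dict(zip(classes, preds[0].tolist()))

def lambda_handler(event, context):
    # The container receives {'url': '<image url>'} and returns class scores.
    return predict(event['url'])
```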
- To try it locally, download this file: test.py and run:
python3 test.py
Or: pipenv run python3 test.py
- To build this you need kitchenware-model.h5 from train.py.
- Convert the model:
ipython
import tensorflow as tf
from tensorflow import keras
model = keras.models.load_model('kitchenware-model.h5')
tf.saved_model.save(model, 'kitchenware-model')
exit()
saved_model_cli show --dir kitchenware-model --all
- Find the signature_def named 'serving_default' and copy it to model-description.txt:
signature_def['serving_default']:
  The given SavedModel SignatureDef contains the following input(s):
    inputs['input_45'] tensor_info:
        dtype: DT_FLOAT
        shape: (-1, 299, 299, 3)
        name: serving_default_input_45:0
  The given SavedModel SignatureDef contains the following output(s):
    outputs['dense_35'] tensor_info:
        dtype: DT_FLOAT
        shape: (-1, 6)
        name: StatefulPartitionedCall:0
  Method name is: tensorflow/serving/predict
- Save these values:
serving_default
input_45 - input
dense_35 - output
- Run the model:
docker run -it --rm -p 8500:8500 -v "$(pwd)/kitchenware-model:/models/kitchenware-model/1" -e MODEL_NAME="kitchenware-model" tensorflow/serving:2.7.0
You will now see a message like: "[evhttp_server.cc : 245] NET_LOG: Entering the event loop ..."
- tf-serving-connect: open tf-serving-connect.ipynb and run it to test the running model.
- Run:
jupyter nbconvert --to script tf-serving-connect.ipynb
and clean up the resulting file so it can run as a script with: python3 tf-serving-connect.py
- Convert this script into a Flask app: add the Flask configuration to tf-serving-connect.py and save it as gateway.py, or download the following files, already configured:
- gateway.py.
- test.py.
- proto.py.
Test it by running: python3 gateway.py
Now that the gateway is running with Flask, run in another window: python3 test.py
The model answers with the most probable class (a sketch of the gateway is shown below).
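To make the flow concrete, here is a minimal sketch of what a gateway.py of this kind typically contains, assuming proto.py provides an np_to_protobuf() helper and using the signature names saved above (serving_default, input_45, dense_35); the class order is an assumption.

```python
# Minimal sketch of a Flask gateway that forwards requests to TF-Serving over
# gRPC; assumes proto.py exposes np_to_protobuf() and guesses the class order.
import grpc
from flask import Flask, request, jsonify
from keras_image_helper import create_preprocessor
from tensorflow_serving.apis import predict_pb2, prediction_service_pb2_grpc

from proto import np_to_protobuf  # assumed helper: numpy array -> TensorProto

channel = grpc.insecure_channel('localhost:8500')
stub = prediction_service_pb2_grpc.PredictionServiceStub(channel)

preprocessor = create_preprocessor('xception', target_size=(299, 299))

classes = ['cup', 'fork', 'glass', 'knife', 'plate', 'spoon']  # order assumed

def predict(url):
    X = preprocessor.from_url(url)

    pb_request = predict_pb2.PredictRequest()
    pb_request.model_spec.name = 'kitchenware-model'
    pb_request.model_spec.signature_name = 'serving_default'
    pb_request.inputs['input_45'].CopyFrom(np_to_protobuf(X))

    pb_response = stub.Predict(pb_request, timeout=20.0)
    preds = pb_response.outputs['dense_35'].float_val
    return dict(zip(classes, preds))

app = Flask('gateway')

@app.route('/predict', methods=['POST'])
def predict_endpoint():
    data = request.get_json()
    return jsonify(predict(data['url']))

if __name__ == '__main__':
    # These are the lines that get commented/uncommented in the steps below:
    # url = 'https://raw.githubusercontent.com/mary435/kitchenware_classification/main/images/6172.jpg'
    # print(predict(url))
    app.run(debug=True, host='0.0.0.0', port=9696)
```

In this sketch, test.py would simply POST a JSON body like {"url": "<image url>"} to http://localhost:9696/predict.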
- Prepare the environment with pipenv:
pipenv --python 3.9
pipenv install grpcio==1.42.0 flask gunicorn keras-image-helper tensorflow-protobuf==2.11.0
- Or download Pipfile and Pipfile.lock from here, and run:
pipenv install
- Download the file image-model.dockerfile and run:
docker build -t kitchenware-model:xception-v4-001 -f image-model.dockerfile .
docker run -it --rm -p 8500:8500 kitchenware-model:xception-v4-001
- Or download and run:
docker pull maryorihuela/kitchenware-model:xception-v4-001
docker run -it --rm -p 8500:8500 maryorihuela/kitchenware-model:xception-v4-001
- For testing, comment out the line
app.run(debug=True, host='0.0.0.0', port=9696)
in gateway.py and run: pipenv run python3 gateway.py
- Now uncomment the line
app.run(debug=True, host='0.0.0.0', port=9696)
in gateway.py and comment out these three lines:
url = 'https://raw.githubusercontent.com/mary435/kitchenware_classification/main/images/6172.jpg'
response = predict(url)
print(response)
- Download the file image-gateway.dockerfile and run:
docker build -t kitchenware-gateway:001 -f image-gateway.dockerfile .
docker run -it --rm -p 9696:9696 kitchenware-gateway:001
- Or download and run:
docker pull maryorihuela/kitchenware-gateway:001
docker run -it --rm -p 9696:9696 maryorihuela/kitchenware-gateway:001
- Download docker compose file: docker-compose.yaml
- Run:
docker-compose up
- Test:
python3 test.py
- Or in detached mode:
docker-compose up -d
And to stop it: docker-compose down
- Install kubectl: search for "kubectl AWS" and install it following the linked instructions. Do the same for "kind", following the instructions for your OS.
- Create a new folder kube-config and download the file model-deployment.yaml into it. Then run:
kind load docker-image kitchenware-model:xception-v4-001
cd kube-config/
kubectl apply -f model-deployment.yaml
kubectl get pod
kubectl port-forward tf-serving-kitchenware-model-#add_here_the_id# 8500:8500
- Testing: comment out the line
app.run(debug=True, host='0.0.0.0', port=9696)
in gateway.py and uncomment the other three lines. Run: pipenv run python3 gateway.py
- Download the file: model-service.yaml
kubectl apply -f model-service.yaml
kubectl get service
kubectl port-forward service/tf-serving-kitchenware-model 8500:8500
- Test:
pipenv run python3 gateway.py
- Download the file: gateway-deployment.yaml
kind load docker-image kitchenware-gateway:001
kubectl get pod
kubectl apply -f gateway-deployment.yaml
kubectl get pod
kubectl port-forward gateway-#add_here_the_id# 9696:9696
- Test:
python3 test.py
- Download the file: gateway-service.yaml
kubectl apply -f gateway-service.yaml
kubectl get service
kubectl port-forward service/gateway 8080:80
- In test.py, change the URL to port 8080, then run:
python3 test.py
Deploying to EKS: see AWS-EKS-configuration.md.
Distributed under the terms of the MIT license, "kitchenware_classification" is free and open source software.