How to Run a Model on a GPU with BentoML in k8s? #4279
Unanswered · asked by 2232729885 in General
Did you include |
Below is my bentofile.yaml:

```yaml
service: "service.py:svc"
include:
python:
  index_url: https://pypi.tuna.tsinghua.edu.cn/simple
  requirements_txt: "requirements.txt"
docker:
  python_version: "3.8"
  cuda_version: "11.7.1"
```
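One thing worth checking: the `cuda_version` option in the `docker` section only makes the CUDA libraries available inside the image; it does not by itself place the model on the GPU. The service code still has to load the model onto a CUDA device. A minimal sketch of that device-selection logic, assuming a PyTorch-style workflow (the helper name `pick_device` is hypothetical, not a BentoML API):

```python
# Sketch only — not the asker's actual service.py.
# pick_device is a hypothetical helper illustrating device selection.
def pick_device(cuda_available: bool) -> str:
    """Return the device string the model should be loaded onto."""
    return "cuda" if cuda_available else "cpu"

try:
    import torch
    cuda_ok = torch.cuda.is_available()  # True only if the container sees a GPU
except ImportError:
    cuda_ok = False  # torch not installed; fall back to CPU

device = pick_device(cuda_ok)
# In a real service.py you would then move the model, e.g. model.to(device),
# before serving requests — otherwise inference stays on the CPU and
# nvidia-smi will show no process on the GPU.
print(device)
```

If the model is never moved to `"cuda"`, `nvidia-smi` will report the GPU as visible but idle, which matches the symptom described below.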
After `bentoml build` and `bentoml containerize`, I start a pod in k8s. I can run `nvidia-smi` in the pod; the result is:
![nvidia-smi output in the pod](https://private-user-images.githubusercontent.com/138634190/282701642-5aa29285-c92b-4f43-bd9a-aff51290111b.png)
But no process runs on the GPU. What should I do?
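Since `nvidia-smi` works inside the pod, the device is at least visible to the container. For completeness, a pod that should use a GPU normally also requests one explicitly through the NVIDIA device plugin resource. A hypothetical pod spec fragment (pod and image names are placeholders, not taken from the question):

```yaml
# Sketch of a GPU-requesting pod; names are illustrative only.
apiVersion: v1
kind: Pod
metadata:
  name: bento-gpu-pod
spec:
  containers:
    - name: bento
      image: my-bento-image:latest   # the containerized bento
      resources:
        limits:
          nvidia.com/gpu: 1          # schedule onto a node with a free GPU
```

Even with the GPU allocated, a process only appears in `nvidia-smi` once the serving code actually runs computation on the CUDA device.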