Hey there podman-py community! I've been working on a personal project that creates containers with podman-py. I'm not sure whether it's possible to spin up containers with podman-py that successfully mount my GPU, and I'd love some assistance, or at least an answer as to whether what I'm trying to do is possible. My environment is a Red Hat Enterprise Linux WSL2 distro with an NVIDIA RTX 3080. I can run the following command from the WSL2 distro, and the container runs with the GPU as expected.
podman run -d -e "PULSE_SERVER=${PULSE_SERVER}" \
  -v /mnt/wslg/:/mnt/wslg/ \
  -v /home/codeuh/dev/hue-ai/voice/OpenVoice:/app/src \
  -v /home/codeuh/dev/hue-ai/shared/tts/:/app/tts \
  --gpus=all \
  --name my-voice \
  my-voice
I'm running another container that mounts in my WSL2's rootless podman socket so the podman-py client can access it. This other container is a web UI that I want to use to manage the other containers in this project. I'd like to be able to use podman-py to build, create, start, stop, and remove containers from inside this container. I have that logic working, but when I create the podman containers with podman-py, I'm not sure how to give them GPU access. Here's my function that creates a container in this web UI.
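Roughly, the create call looks like this (the socket path, image reference, and environment handling are simplified placeholders; the bind mounts mirror the podman run command above):

import os

from podman import PodmanClient

# Placeholder: wherever the WSL2 rootless podman socket is mounted
# inside the web-ui container.
PODMAN_SOCKET = "unix:///run/user/1000/podman/podman.sock"


def create_voice_container(client: PodmanClient):
    """Create and start the voice container, mirroring the CLI command above."""
    container = client.containers.create(
        image="localhost/my-voice:latest",  # placeholder image reference
        name="my-voice",
        environment={"PULSE_SERVER": os.environ.get("PULSE_SERVER", "")},
        mounts=[
            {"type": "bind", "source": "/mnt/wslg", "target": "/mnt/wslg"},
            {"type": "bind",
             "source": "/home/codeuh/dev/hue-ai/voice/OpenVoice",
             "target": "/app/src"},
            {"type": "bind",
             "source": "/home/codeuh/dev/hue-ai/shared/tts",
             "target": "/app/tts"},
        ],
        # The open question: what do I pass here to get the effect of --gpus=all?
    )
    container.start()
    return container


with PodmanClient(base_url=PODMAN_SOCKET) as client:
    create_voice_container(client)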
Does anyone know if there's a way to achieve what --gpus=all does, but through the podman-py client, in the environment I described?
Thanks for your time and consideration,
codeuh