python main_chat.py <> CUDA out of memory #5
Comments
Same here: I was using a single 3090 GPU and hit the OOM error as well.
Since a single 3090 runs out of memory, can I use two 3090s for parallel training? It appears that the source code does not support parallel training.
I successfully deployed with bf16 (about 26GB of VRAM shown in use), but after entering user interaction input the VRAM ran out. I am using a 40GB GPU. What is the problem?
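VRAM running out only after interactive input is consistent with KV-cache growth: every token in the context adds key/value tensors across all layers. A rough sketch of the arithmetic, assuming Llama-7B-like dimensions (32 layers, 4096 hidden size; the thread does not name the exact model, so these are assumptions):

```python
def kv_cache_gib(seq_len: int, n_layers: int = 32, hidden: int = 4096,
                 bytes_per_elem: int = 2) -> float:
    """Approximate bf16 KV-cache size: K and V tensors for every layer."""
    per_token = 2 * n_layers * hidden * bytes_per_elem  # K + V, all layers
    return seq_len * per_token / 1024**3

# At a 2048-token context this is about 1 GiB; long multi-turn chats keep
# growing it, and activation buffers plus allocator fragmentation add more
# on top of the resident model weights.
print(round(kv_cache_gib(2048), 2))
```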
I had the same problem; the GPU simply wasn't large enough. I switched to an A800 (80GB) and it ran successfully, with both chat and pose generation working!
Hello,
Running main_chat.py with bf16 precision I get an OOM error. I am using a 24GB GPU. Is this expected? I can't find info about the minimal GPU requirement. I also get AttributeError: 'LlamaAttention' object has no attribute 'rope_theta'. I think this is related to deepspeed, which was not listed in the requirements, so should I install a specific version? Thanks!
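For a sense of why 24GB is borderline: weights alone set a hard floor. A back-of-envelope helper, assuming a 7B-parameter model (an assumption; the thread never states the parameter count):

```python
def estimate_vram_gb(n_params: float, bytes_per_param: int = 2) -> float:
    """Lower bound on VRAM just to hold the weights (bf16 = 2 bytes/param).

    Inference adds KV cache and activation overhead on top, so the real
    footprint is noticeably higher than this.
    """
    return n_params * bytes_per_param / 1024**3

# A 7B model in bf16 needs roughly 13 GiB for weights alone, so a 24 GB
# card can still OOM once generation buffers and the KV cache grow.
print(round(estimate_vram_gb(7e9), 1))
```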