
VRAM Usage: Is a 4090 sufficient? #2

Open
hylarucoder opened this issue Feb 18, 2025 · 4 comments
Comments


hylarucoder commented Feb 18, 2025

VRAM Usage: Is a 4090 sufficient?

@heartInsert

Have you tried it?

@qiudi0127
Collaborator

@hylarucoder Yes, you can try.

@hylarucoder
Author

Thanks @qiudi0127 @heartInsert

I have a 4060 Ti with 16GB of VRAM, so it seems like it would take a very long time to run. I tested it on an A100 with 40GB of VRAM in Colab, and it required approximately 18GB of VRAM.
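For anyone wanting to reproduce this measurement, here is a minimal sketch of how a peak-VRAM figure like the ~18GB above can be checked with PyTorch's built-in memory counters. It assumes the project runs on PyTorch with CUDA; `run_inference` is a hypothetical stand-in for one full generation pass of the model.

```python
import torch

def report_peak_vram(run_inference) -> float:
    """Run a callable once and report the peak GPU memory PyTorch allocated."""
    assert torch.cuda.is_available(), "a CUDA device is required"

    torch.cuda.empty_cache()
    torch.cuda.reset_peak_memory_stats()

    run_inference()  # e.g. one complete sampling / generation pass

    torch.cuda.synchronize()
    peak_gib = torch.cuda.max_memory_allocated() / 1024**3
    print(f"Peak VRAM allocated: {peak_gib:.1f} GiB")
    return peak_gib
```

Note that `torch.cuda.max_memory_allocated()` only counts memory allocated through PyTorch's caching allocator, so `nvidia-smi` will typically show a somewhat larger reserved figure.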

Would you mind if I implemented a Colab version PR?

@hylarucoder
Author

Submitted a pull request for Colab. @qiudi0127

#3
