
Optimised for 6 GB? #8

Open
AndrWeisR opened this issue Sep 11, 2022 · 1 comment

Comments


AndrWeisR commented Sep 11, 2022

Can this be optimised to run in 6 GB, similarly to Hlky's https://github.com/sd-webui/stable-diffusion-webui/tree/master/optimizedSD?

I'm getting CUDA out-of-memory errors on 6 GB at 512x512 when using both prompt and edit_prompt. Generating a single 512x512 image works.

bloc97 (Owner) commented Sep 11, 2022

I will need to see whether this can be optimized for 6 GB, as we need to save the attention maps, which are pretty big for the lower parts of the U-Net (8×4096×4096). A workaround might be to save them to system RAM and load them back one by one during inference.
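A minimal sketch of that workaround in PyTorch, assuming the maps are collected per cross-attention layer during the reference pass (the `OffloadedAttentionStore` class and the layer names below are hypothetical, not this repo's actual API): each map is moved to pinned CPU memory as soon as it is computed, and only the map for the layer currently being evaluated is copied back to the GPU during the edited pass.

```python
import torch

class OffloadedAttentionStore:
    """Hypothetical helper: keep saved attention maps in CPU RAM
    instead of VRAM, bringing them back one layer at a time."""

    def __init__(self):
        self._maps = {}  # layer name -> attention map stored on CPU

    def save(self, layer_name: str, attn: torch.Tensor) -> None:
        # Moving to CPU frees the VRAM copy once the GPU tensor is released;
        # pin_memory() speeds up the later host-to-device transfer.
        self._maps[layer_name] = attn.detach().cpu().pin_memory()

    def load(self, layer_name: str, device: torch.device) -> torch.Tensor:
        # Only this one map occupies VRAM at a time.
        return self._maps[layer_name].to(device, non_blocking=True)

# Usage sketch (layer names and call sites are illustrative):
# store = OffloadedAttentionStore()
# reference pass, inside each cross-attention layer:
#     store.save("down_0.attn2", attention_probs)
# edited pass, same layer:
#     saved = store.load("down_0.attn2", attention_probs.device)
#     attention_probs = edit_attention(attention_probs, saved)
```

At fp16 a single 8×4096×4096 map is 256 MiB, so keeping only one on the GPU while the rest sit in system RAM would save several gigabytes of VRAM, at the cost of extra host-to-device copies per denoising step.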
