RTX 3090 : cudaMallocAsync #2994
Replies: 3 comments
-
I'm not a pro and I may be talking nonsense, but as far as I can tell everything is OK. I'm not a programmer, though, and everyone here is smarter than me.
-
Same here
-
Cuda not accepted. Device: cuda:0

Loading log:

```
Set vram state to: HIGH_VRAM
Device: cuda:0 NVIDIA GeForce RTX 3090 : cudaMallocAsync
VAE dtype: torch.bfloat16
Using pytorch cross attention
Cuda not accepted. Device: cuda:0
```
How can I solve this problem? I'm guessing GPU acceleration is not working correctly?
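The log above doesn't show the root cause, but a first step when a tool rejects `cuda:0` is to confirm that the PyTorch install itself can see the GPU. This is a minimal sanity check, assuming the environment the tool runs in uses PyTorch; it only reads version and device information:

```python
import torch

# Report the PyTorch build and the CUDA toolkit it was compiled against.
# torch.version.cuda is None on CPU-only wheels, which is a common cause
# of "Cuda not accepted" style errors.
print("torch:", torch.__version__)
print("built with CUDA:", torch.version.cuda)
print("cuda available:", torch.cuda.is_available())

if torch.cuda.is_available():
    # Name of the first visible GPU, e.g. "NVIDIA GeForce RTX 3090".
    print("device 0:", torch.cuda.get_device_name(0))
```

If `cuda available` prints `False`, the usual suspects are a CPU-only PyTorch wheel or an NVIDIA driver that is too old for the CUDA version the wheel was built with; run this inside the same virtual environment the tool uses.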