This repository has been archived by the owner on Nov 9, 2023. It is now read-only.

RuntimeError: mat1 and mat2 must have the same dtype #11

Open
Pedrodilamuerte opened this issue Aug 1, 2023 · 10 comments

Comments

@Pedrodilamuerte

Hello,

I'm trying to use the extension, but I get this error when the refiner kicks in. Any ideas?

RuntimeError: mat1 and mat2 must have the same dtype

Best regards!

@dhwz
Contributor

dhwz commented Aug 1, 2023

Try removing --no-half or --no-half-vae from the command-line args
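For reference, the error itself just means two matrices in a matmul ended up with different dtypes — typically fp32 model weights (which --no-half keeps) meeting fp16 activations. A minimal repro outside the webui, assuming PyTorch is installed:

```python
import torch

# fp32 weights (what --no-half keeps the model in) meeting fp16 activations
weights = torch.randn(4, 4, dtype=torch.float32)
activations = torch.randn(2, 4, dtype=torch.float16)

try:
    activations @ weights  # mixed-dtype matmul is rejected
except RuntimeError as exc:
    print(f"RuntimeError: {exc}")

# Casting both sides to one dtype is the generic fix
out = activations.to(torch.float32) @ weights
print(out.dtype)  # torch.float32
```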

@Pedrodilamuerte
Author

yep it works but I get black images ^^'

@dhwz
Contributor

dhwz commented Aug 2, 2023

> yep it works but I get black images ^^'

You're on AMD hardware?

@amimi818

amimi818 commented Aug 2, 2023

> yep it works but I get black images ^^'

Remove --disable-nan-check, and try again
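Some context on why removing that flag helps diagnose this: --disable-nan-check turns off the webui's check that would otherwise stop generation when NaNs appear, so NaN outputs silently decode into black images. A rough sketch of what such a check does (the function name and message below are illustrative, not the webui's actual code):

```python
import torch

def check_for_nans(x: torch.Tensor, where: str) -> None:
    # Illustrative stand-in for the webui's NaN check: fail loudly
    # instead of letting NaNs decode into a black image.
    if torch.isnan(x).any():
        raise RuntimeError(f"NaNs were produced in {where}")

check_for_nans(torch.ones(3), "vae")  # fine, passes silently

try:
    check_for_nans(torch.tensor([1.0, float("nan")]), "vae")
except RuntimeError as exc:
    print(exc)  # NaNs were produced in vae
```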

@Pedrodilamuerte
Author

> yep it works but I get black images ^^'
>
> You're on AMD hardware?

No, Nvidia GPU and Intel Core

@Pedrodilamuerte
Author

> yep it works but I get black images ^^'
>
> Remove --disable-nan-check, and try again

Didn't work, sadly...

@dhwz
Contributor

dhwz commented Aug 3, 2023

That's odd, you shouldn't get NaNs on Nvidia hardware. Which GPU?
Please check that you haven't set --precision full

@Pedrodilamuerte
Author

I have an RTX 3060, and no, I didn't. I made a fresh install, but now I get this error:

OutOfMemoryError: CUDA out of memory. Tried to allocate 26.00 MiB (GPU 0; 12.00 GiB total capacity; 11.06 GiB already allocated; 0 bytes free; 11.30 GiB reserved in total by PyTorch) If reserved memory is >> allocated memory try setting max_split_size_mb to avoid fragmentation. See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF

I didn't even get it before ...
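The error message's own suggestion can be tried by setting PYTORCH_CUDA_ALLOC_CONF before launching the webui; the 128 MiB value below is just an illustrative starting point, not something recommended in this thread:

```shell
# Illustrative: limit the size of blocks the CUDA caching allocator
# will split, which can reduce fragmentation. 128 is an example value.
export PYTORCH_CUDA_ALLOC_CONF=max_split_size_mb:128
echo "$PYTORCH_CUDA_ALLOC_CONF"
```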

@Pedrodilamuerte
Author

It does work now, but only at 1024x1024, not more. That happens regardless of whether the extension is used, though...

@dhwz
Contributor

dhwz commented Aug 7, 2023

Try --medvram
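In the AUTOMATIC1111 webui this flag goes into COMMANDLINE_ARGS in webui-user.bat / webui-user.sh; a minimal sketch of the Linux variant (passing only --medvram is an assumption based on this thread, combine with your other args as needed):

```shell
# Example webui-user.sh fragment: hand --medvram to the launcher so it
# keeps only parts of the model on the GPU at a time, reducing VRAM use.
export COMMANDLINE_ARGS="--medvram"
echo "$COMMANDLINE_ARGS"
```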

3 participants