
[Bug]Found at least two devices, cuda:0 and cpu! #63

Open
yeungmozhu opened this issue Jun 15, 2024 · 8 comments

Comments

@yeungmozhu

No description provided.

@yeungmozhu changed the title from "Same problem, selecting cpu in T5 loader node and getting this error when running to ksample node. Expected all tensors to be on the same device, but found at least two devices, cuda:0 and cpu!" to "Selecting cpu in T5 loader node and getting this error when running to ksample node. Expected all tensors to be on the same device, but found at least two devices, cuda:0 and cpu!" Jun 15, 2024
@yeungmozhu
Author

Selecting cpu in the T5 loader node and getting this error when execution reaches the KSampler node: Expected all tensors to be on the same device, but found at least two devices, cuda:0 and cpu!

@yeungmozhu changed the title from "Selecting cpu in T5 loader node and getting this error when running to ksample node. Expected all tensors to be on the same device, but found at least two devices, cuda:0 and cpu!" to "[Bug]Found at least two devices, cuda:0 and cpu!" Jun 15, 2024
@ReyJ94

ReyJ94 commented Jun 16, 2024

Having the exact same problem when trying your implementation of HunYuan DiT.

@lger4567

File "D:\ComfyUI_2\custom_nodes\ComfyUI_ExtraModels\HunYuanDiT\models\poolers.py", line 19, in forward
x = x + self.positional_embedding[:, None, :].to(x.dtype) # (L+1)NC
~~^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
RuntimeError: Expected all tensors to be on the same device, but found at least two devices, cuda:0 and cpu!
I saw that an almost identical issue was raised before and has been closed; restarting ComfyUI did not fix it for me.
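The traceback above is the standard PyTorch device-mismatch error: `x` lives on `cuda:0` while `self.positional_embedding` was left on the CPU, and elementwise ops require both operands on the same device. The usual fix is to move the buffer with `.to()` before the addition (e.g. `.to(device=x.device, dtype=x.dtype)`). The snippet below is a minimal CPU-only stand-in illustrating that failure mode and the fix; `FakeTensor` is invented purely for illustration and is not part of torch or this plugin:

```python
# Minimal stand-in (no GPU or torch needed) for the device-mismatch error.
# An elementwise op between buffers tagged with different devices raises,
# and moving one operand with .to() resolves it.
from dataclasses import dataclass

@dataclass
class FakeTensor:
    data: list
    device: str

    def to(self, device):
        # Return a copy tagged with the target device
        # (real torch would also move the underlying storage).
        return FakeTensor(list(self.data), device)

    def __add__(self, other):
        if self.device != other.device:
            raise RuntimeError(
                "Expected all tensors to be on the same device, "
                f"but found at least two devices, {self.device} and {other.device}!"
            )
        return FakeTensor([a + b for a, b in zip(self.data, other.data)], self.device)

x = FakeTensor([1.0, 2.0], "cuda:0")     # activations on the GPU
pos_emb = FakeTensor([0.5, 0.5], "cpu")  # buffer left on the CPU

try:
    _ = x + pos_emb                      # reproduces the reported error
except RuntimeError as e:
    print(e)

y = x + pos_emb.to(x.device)             # fix: move the buffer first
print(y.data, y.device)
```

In the real module the same idea means ensuring the positional embedding buffer is registered on (or moved to) the module's device before the forward pass, which is typically what a "load the whole model on one device" fix accomplishes.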

@city96
Owner

city96 commented Jun 16, 2024

Update and try again? I suspect it will still fail on low vram machines but maybe it'll work.

@lger4567

I updated both ComfyUI and the plugin just yesterday, and I've seen others run it on 8 GB of VRAM. My system RAM is also large enough, so I'm not sure whether a ComfyUI update caused this or something else. Thank you for your reply; please take a look when you have time. Much appreciated.

@city96
Owner

city96 commented Jun 16, 2024

> yesterday

I pushed changes ~30 minutes ago (5101719) that might fix the issue, so please update again. If you see "loading in lowvram mode" in the console, then it will still error due to not having enough VRAM, since the model can't be partially offloaded.

@lger4567

city96, thank you so much for the quick reply. I'll try it again tomorrow. Good night.

@lger4567

It runs all the way through after updating, very smoothly. Great work, thanks again.
