3221225477: can't load checkpoints successfully #1
Comments
Use the base flux dev / schnell weights instead of a quantized version; the codebase does not currently support loading fp8 weights directly.
Also, can you share the entire error log?
Yes, I tried the https://huggingface.co/black-forest-labs/FLUX.1-dev/blob/main/flux1-dev.safetensors model file and got the same error.
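For reference, a minimal sketch (assuming the `safetensors` Python library and a local copy of the file; the path is just an example) that reports which dtypes a checkpoint actually stores, without loading the full tensors into memory:

```python
from safetensors import safe_open

path = "flux1-dev.safetensors"  # hypothetical local path
dtypes = set()
with safe_open(path, framework="pt", device="cpu") as f:
    for key in f.keys():
        # get_slice() reads header metadata only, so no tensor data is loaded
        dtypes.add(f.get_slice(key).get_dtype())
print(dtypes)  # e.g. {'BF16'} for the official FLUX.1-dev weights
```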
I understand the node should load the model, then merge and convert it in memory, but perhaps the dev model had already loaded and the error occurred before the merge-and-convert step. Are there any GPU requirements? For example, must BF16 be supported?
It seems like you're trying to create a checkpoint, which isn't going to work. The model has a completely different architecture, hence the need for this custom node in the first place. Is there a particular reason why you're trying to save a checkpoint here?
No, just for testing, because running your workflow errors out and stops on the "FluxModCheckpointLoader" node.
What GPU are you currently using and how much VRAM does it have? I don't see anything out of the ordinary in the logs that you provided earlier. We're going to need more detailed information than what you gave.
An NVIDIA P40 with 24 GB VRAM. From what I saw, though, the node loads to CPU first; maybe the error occurred before it started to convert and transfer to the GPU.
Ah yes, that would be it, since the model is currently set up only for bf16, which requires a higher compute capability (I think 8.0?), whilst the P40 only supports fp16. I'm not sure if ComfyUI is doing an on-the-fly conversion for Flux, since the model is bf16 on Hugging Face. I'm guessing the default may be either bf16 or fp16.
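A quick way to check this on your machine, as a minimal sketch assuming PyTorch with CUDA (native bf16 support did arrive with compute capability 8.0, i.e. Ampere):

```python
import torch

# Compute capability of the first GPU; the Tesla P40 (Pascal) reports 6.1,
# below the 8.0 (Ampere) threshold where native bf16 support arrived.
major, minor = torch.cuda.get_device_capability(0)
print(f"compute capability: {major}.{minor}")
print("native bf16:", (major, minor) >= (8, 0))
```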
Yes, ComfyUI can do the on-the-fly conversion from bf16 to fp16 or fp8_e4m3...
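For illustration only, this is roughly the kind of conversion meant here: a sketch using PyTorch and safetensors, not ComfyUI's actual loading code. The path is hypothetical, and the fp8 cast is a plain downcast, not the scaled quantization a real fp8 pipeline would use:

```python
import torch
from safetensors.torch import load_file

# Hypothetical path; the official FLUX.1-dev weights are stored as bf16.
state_dict = load_file("flux1-dev.safetensors")

target = torch.float16  # or torch.float8_e4m3fn (requires PyTorch 2.1+)
converted = {k: v.to(target) for k, v in state_dict.items()}
```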
I tested loading the FLUX.1 dev bf16 and fp8 safetensors files together with your "universal_modulator.safetensors" in the "FluxModCheckpointLoader" node, and neither loads successfully. The error message is 3221225477 (0xC0000005, the Windows access-violation exit code); the web UI loses its connection to the server and the ComfyUI server has to be restarted.