[Feature Request] Support for Lumina #56
I've had a play around with Lumina as well as Large-DiT in the past, but they were pretty undertrained. Looks like it got updated recently and no longer uses llama as the text model, making it actually possible to run lol. Might take a shot at it, but it's not a priority at the moment.
So it looks like they just released a new version of Lumina, Lumina-Next-SFT, which is another step better. I wasn't majorly impressed with the previous Lumina, but this one is actually rather impressive.
Lumina-Next-SFT looks amazing, and it would be awesome if ComfyUI got support for it.
Agree - please add support for this. It has so much promise.
Please add support for this.
https://github.com/Alpha-VLLM/Lumina-T2X
This looks like a promising variation on text-to-anything. It'd be nice to get support for it, as at the moment it's limited to Gradio demos or raw Python code.