Stable Diffusion 3.5 missing parameters when generating toml file: Blocks to swap and Fused Backward Pass #3094
Comments
This is really a question for Kohya. I don't really train SD models anymore... hence why support for updates is lacking. I will try to see why the other two parameters are not used... if they are in the GUI they should probably take effect, unless I made a mistake in the code and they are not properly handled.
I managed to get CLIP-L and CLIP-G training working via extra arguments; block swap will probably work too, but I haven't tested it yet. But yes, the GUI didn't write them into the toml file. Thanks for all the support so far.
I fixed the issue with block_to_swap and fused_backward_pass and just pushed an update. Regarding the CLIP-L and CLIP-G parameters, are they also missing in the SD3 LoRA GUI? What are the actual missing parameters that you passed manually as extra arguments?
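For reference, a minimal sketch of what the relevant entries could look like in the generated config toml once the GUI writes them; the key names and values here are assumptions based on the flag names discussed in this thread, not a confirmed excerpt:

```toml
# Hypothetical excerpt of a GUI-generated config toml.
# Key names are assumed to mirror the sd-scripts CLI flags and may differ
# between versions; the values are placeholders.
fused_backward_pass = true  # fuse the optimizer step into the backward pass to reduce VRAM use
blocks_to_swap = 20         # number of model blocks swapped to CPU during training
```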
I am doing DreamBooth training, and yes, the way it is provided is te1, te2 and te3. These are the extra params I used, and they worked.
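For illustration only, a hypothetical sketch of text-encoder learning-rate entries following the te1/te2/te3 naming above, written as config-toml keys; the flag names and values are assumptions and not the arguments actually used here:

```toml
# Hypothetical text-encoder learning rates for SD3.5 DreamBooth training.
# The te1/te2/te3 naming follows the comment above; values are placeholders
# and should be checked against your sd-scripts version.
learning_rate_te1 = 1e-5  # assumed to control CLIP-L
learning_rate_te2 = 1e-5  # assumed to control CLIP-G
learning_rate_te3 = 5e-6  # assumed to control T5-XXL
```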
Ah... I fixed the LoRA one... DreamBooth might still have issues... I will have to look into that too.
block_to_swap and fused_backward_pass should work now for DreamBooth. I will tackle the learning rates tomorrow.
Thank you so much. Also, do you have any ideas about this?
I am starting to research Stable Diffusion 3.5 Large training.
Currently, Fused Backward Pass and Blocks to swap are not saved into the toml file, so they make no difference.
Also, does Stable Diffusion 3.5 Large support CLIP-L, T5-XXL, or CLIP-G training?
Thank you so much for the fixes and info @bmaltais