Issue loading a PEFT lora model #70

Open
frutiemax92 opened this issue Jul 1, 2024 · 3 comments
Comments


frutiemax92 commented Jul 1, 2024

When trying to load this PEFT LoRA model, I run into multiple issues:
https://www.dropbox.com/scl/fi/30y9yn26ao8pnwch7z1ex/test_lora.zip?rlkey=r6kvgzwvrqm9tnw4jz8ctgu2f&st=x8r69pb3&dl=0

The first is that the code crashes at the line k = new_modelpatcher.add_patches(loaded, strength) in lora.py when using the automatic CFG node after loading the LoRA.

Also, when loading the LoRA, it doesn't find all the layers it needs to patch, i.e. only 14 of the 574 keys are matched. An image is still generated, but the LoRA is not applied correctly!

PixArt: LoRA conversion has leftover keys! (14 vs 574)
['transformer_blocks.0.attn1.to_k.lora_A.weight', 'transformer_blocks.8.attn2.to_q.lora_B.weight', 'transformer_blocks.1.attn1.to_q.lora_B.weight', 'transformer_blocks.23.attn1.to_out.0.lora_A.weight', 'transformer_blocks.15.attn1.to_k.lora_A.weight', 'transformer_blocks.19.attn2.to_k.lora_B.weight', 'transformer_blocks.4.attn1.to_v.lora_A.weight', 'transformer_blocks.22.attn2.to_out.0.lora_A.weight', 'transformer_blocks.7.ff.net.0.proj.lora_B.weight', 'transformer_blocks.9.attn1.to_out.0.lora_A.weight', 'transformer_blocks.7.attn1.to_k.lora_B.weight', 'transformer_blocks.16.attn1.to_out.0.lora_A.weight', 'transformer_blocks.2.attn2.to_out.0.lora_A.weight', 'transformer_blocks.3.attn2.to_k.lora_A.weight', 'transformer_blocks.9.attn2.to_q.lora_B.weight', 'transformer_blocks.1.attn2.to_k.lora_B.weight', 'transformer_blocks.26.attn2.to_q.lora_B.weight', 'transformer_blocks.10.attn1.to_v.lora_A.weight', 'transformer_blocks.2.attn1.to_out.0.lora_B.weight', 'transformer_blocks.5.attn1.to_q.lora_A.weight', 'transformer_blocks.25.attn1.to_k.lora_B.weight', 'transformer_blocks.9.attn1.to_k.lora_B.weight', 'transformer_blocks.12.attn1.to_k.lora_B.weight', 'transformer_blocks.25.ff.net.0.proj.lora_B.weight', 'transformer_blocks.27.attn1.to_v.lora_A.weight', 'transformer_blocks.1.attn1.to_out.0.lora_A.weight', 'transformer_blocks.8.ff.net.2.lora_A.weight', 'transformer_blocks.18.attn1.to_out.0.lora_B.weight', 'transformer_blocks.12.attn2.to_v.lora_A.weight', 'transformer_blocks.0.attn1.to_out.0.lora_B.weight', 'transformer_blocks.11.attn2.to_q.lora_A.weight', 'transformer_blocks.9.ff.net.2.lora_B.weight', 'transformer_blocks.7.attn1.to_q.lora_A.weight', 'transformer_blocks.11.attn1.to_v.lora_A.weight', 'transformer_blocks.20.attn2.to_out.0.lora_A.weight', 'transformer_blocks.26.attn2.to_q.lora_A.weight', 'transformer_blocks.13.attn2.to_q.lora_B.weight', 'transformer_blocks.6.attn2.to_k.lora_B.weight', 'transformer_blocks.15.attn1.to_v.lora_A.weight', 'transformer_blocks.16.attn2.to_out.0.lora_B.weight', 'transformer_blocks.27.attn2.to_k.lora_B.weight', 'transformer_blocks.7.attn2.to_out.0.lora_A.weight', 'transformer_blocks.7.attn2.to_out.0.lora_B.weight', 'transformer_blocks.20.attn1.to_v.lora_B.weight', 'transformer_blocks.25.attn2.to_q.lora_B.weight', 'transformer_blocks.17.ff.net.0.proj.lora_B.weight', 'transformer_blocks.15.ff.net.2.lora_B.weight', 'transformer_blocks.8.attn2.to_v.lora_B.weight', 'transformer_blocks.19.ff.net.2.lora_B.weight', 'transformer_blocks.11.attn2.to_k.lora_B.weight', 'transformer_blocks.13.attn2.to_out.0.lora_B.weight', 'transformer_blocks.5.ff.net.2.lora_A.weight', 'transformer_blocks.6.attn1.to_out.0.lora_B.weight', 'transformer_blocks.12.attn1.to_out.0.lora_A.weight', 'transformer_blocks.19.attn2.to_q.lora_A.weight', 'transformer_blocks.2.ff.net.2.lora_A.weight', 'transformer_blocks.0.attn2.to_out.0.lora_B.weight', 'transformer_blocks.10.ff.net.0.proj.lora_A.weight', 'transformer_blocks.15.attn1.to_q.lora_B.weight', 'transformer_blocks.17.ff.net.0.proj.lora_A.weight', 'transformer_blocks.22.ff.net.2.lora_B.weight', 'transformer_blocks.6.attn1.to_q.lora_A.weight', 'transformer_blocks.3.attn1.to_q.lora_B.weight', 'transformer_blocks.17.attn2.to_q.lora_A.weight', 'transformer_blocks.25.attn1.to_k.lora_A.weight', 'transformer_blocks.4.attn1.to_out.0.lora_A.weight', 'transformer_blocks.4.attn2.to_q.lora_A.weight', 'transformer_blocks.9.attn1.to_v.lora_A.weight', 'transformer_blocks.1.attn2.to_q.lora_A.weight', 'transformer_blocks.1.attn1.to_v.lora_A.weight', 
'transformer_blocks.18.attn2.to_out.0.lora_A.weight', 'transformer_blocks.6.attn1.to_v.lora_B.weight', 'transformer_blocks.8.attn2.to_out.0.lora_A.weight', 'transformer_blocks.0.attn1.to_out.0.lora_A.weight', 'transformer_blocks.6.attn1.to_out.0.lora_A.weight', 'transformer_blocks.21.ff.net.2.lora_A.weight', 'transformer_blocks.17.attn1.to_q.lora_B.weight', 'transformer_blocks.22.attn2.to_k.lora_A.weight', 'transformer_blocks.25.attn2.to_v.lora_A.weight', 'transformer_blocks.23.attn2.to_v.lora_A.weight', 'transformer_blocks.15.attn2.to_q.lora_B.weight', 'transformer_blocks.27.ff.net.2.lora_A.weight', 'transformer_blocks.13.attn1.to_q.lora_B.weight', 'transformer_blocks.15.ff.net.2.lora_A.weight', 'transformer_blocks.16.attn1.to_v.lora_A.weight', 'transformer_blocks.12.ff.net.2.lora_B.weight', 'transformer_blocks.16.attn1.to_q.lora_A.weight', 'transformer_blocks.17.attn2.to_v.lora_B.weight', 'transformer_blocks.11.ff.net.2.lora_B.weight', 'transformer_blocks.1.attn2.to_out.0.lora_B.weight', 'transformer_blocks.23.attn2.to_q.lora_B.weight', 'transformer_blocks.27.attn1.to_k.lora_A.weight', 'transformer_blocks.23.attn1.to_v.lora_A.weight', 'transformer_blocks.3.attn1.to_v.lora_B.weight', 'transformer_blocks.24.attn2.to_q.lora_B.weight', 'transformer_blocks.2.attn1.to_v.lora_A.weight', 'transformer_blocks.1.attn2.to_q.lora_B.weight', 'transformer_blocks.18.ff.net.0.proj.lora_B.weight', 'transformer_blocks.24.attn2.to_v.lora_A.weight', 'transformer_blocks.27.attn2.to_v.lora_B.weight', 'transformer_blocks.6.ff.net.0.proj.lora_A.weight', 'transformer_blocks.8.attn2.to_out.0.lora_B.weight', 'transformer_blocks.23.ff.net.0.proj.lora_B.weight', 'transformer_blocks.24.attn1.to_q.lora_B.weight', 'transformer_blocks.26.attn1.to_v.lora_B.weight', 'transformer_blocks.0.attn1.to_q.lora_A.weight', 'transformer_blocks.13.attn2.to_q.lora_A.weight', 'transformer_blocks.20.attn1.to_out.0.lora_B.weight', 'transformer_blocks.7.attn1.to_out.0.lora_B.weight', 'transformer_blocks.27.attn1.to_out.0.lora_A.weight', 'transformer_blocks.4.ff.net.0.proj.lora_B.weight', 'transformer_blocks.10.attn1.to_q.lora_A.weight', 'transformer_blocks.1.attn2.to_k.lora_A.weight', 'transformer_blocks.15.attn2.to_out.0.lora_A.weight', 'transformer_blocks.0.attn2.to_v.lora_B.weight', 'transformer_blocks.13.attn1.to_v.lora_B.weight', 'transformer_blocks.3.ff.net.2.lora_A.weight', 'transformer_blocks.9.attn1.to_q.lora_A.weight', 'transformer_blocks.14.ff.net.2.lora_B.weight', 'transformer_blocks.18.attn1.to_v.lora_B.weight', 'transformer_blocks.23.attn2.to_out.0.lora_A.weight', 'transformer_blocks.16.attn2.to_q.lora_B.weight', 'transformer_blocks.27.attn1.to_k.lora_B.weight', 'transformer_blocks.3.attn1.to_v.lora_A.weight', 'transformer_blocks.27.attn1.to_v.lora_B.weight', 'transformer_blocks.11.attn1.to_q.lora_A.weight', 'transformer_blocks.15.attn2.to_q.lora_A.weight', 'transformer_blocks.1.ff.net.2.lora_B.weight', 'transformer_blocks.23.attn1.to_k.lora_A.weight', 'transformer_blocks.26.attn1.to_q.lora_A.weight', 'transformer_blocks.4.attn2.to_k.lora_B.weight', 'transformer_blocks.3.attn2.to_q.lora_B.weight', 'transformer_blocks.9.attn2.to_out.0.lora_A.weight', 'transformer_blocks.20.attn2.to_v.lora_A.weight', 'transformer_blocks.24.ff.net.2.lora_A.weight', 'transformer_blocks.11.attn1.to_out.0.lora_A.weight', 'transformer_blocks.3.attn2.to_v.lora_B.weight', 'transformer_blocks.21.attn1.to_q.lora_A.weight', 'transformer_blocks.17.ff.net.2.lora_B.weight', 'transformer_blocks.15.attn1.to_k.lora_B.weight', 
'transformer_blocks.25.attn1.to_v.lora_B.weight', 'transformer_blocks.22.ff.net.0.proj.lora_A.weight', 'transformer_blocks.25.attn1.to_out.0.lora_B.weight', 'transformer_blocks.5.attn2.to_v.lora_A.weight', 'transformer_blocks.16.attn2.to_k.lora_B.weight', 'transformer_blocks.26.attn1.to_k.lora_B.weight', 'transformer_blocks.3.attn2.to_k.lora_B.weight', 'transformer_blocks.10.attn1.to_out.0.lora_B.weight', 'transformer_blocks.19.attn2.to_k.lora_A.weight', 'transformer_blocks.6.attn2.to_v.lora_A.weight', 'transformer_blocks.12.attn2.to_v.lora_B.weight', 'transformer_blocks.21.attn1.to_v.lora_A.weight', 'transformer_blocks.7.ff.net.0.proj.lora_A.weight', 'transformer_blocks.24.attn2.to_v.lora_B.weight', 'transformer_blocks.4.ff.net.2.lora_A.weight', 'transformer_blocks.15.attn2.to_v.lora_B.weight', 'transformer_blocks.17.attn1.to_v.lora_A.weight', 'transformer_blocks.2.attn2.to_k.lora_B.weight', 'transformer_blocks.5.attn1.to_v.lora_A.weight', 'transformer_blocks.2.attn2.to_q.lora_B.weight', 'transformer_blocks.25.attn2.to_q.lora_A.weight', 'transformer_blocks.26.ff.net.2.lora_B.weight', 'transformer_blocks.23.attn1.to_q.lora_A.weight', 'transformer_blocks.16.attn2.to_k.lora_A.weight', 'transformer_blocks.19.attn1.to_k.lora_A.weight', 'transformer_blocks.12.ff.net.0.proj.lora_B.weight', 'transformer_blocks.20.ff.net.0.proj.lora_B.weight', 'transformer_blocks.16.attn2.to_v.lora_B.weight', 'transformer_blocks.21.attn1.to_out.0.lora_A.weight', 'transformer_blocks.21.attn2.to_v.lora_B.weight', 'transformer_blocks.16.attn1.to_out.0.lora_B.weight', 'transformer_blocks.9.attn2.to_q.lora_A.weight', 'transformer_blocks.5.ff.net.0.proj.lora_A.weight', 'transformer_blocks.26.attn2.to_k.lora_B.weight', 'transformer_blocks.19.attn2.to_v.lora_B.weight', 'transformer_blocks.17.attn2.to_out.0.lora_B.weight', 'transformer_blocks.6.attn1.to_q.lora_B.weight', 'transformer_blocks.21.attn1.to_q.lora_B.weight', 'transformer_blocks.26.attn1.to_k.lora_A.weight', 'transformer_blocks.9.attn2.to_k.lora_A.weight', 'transformer_blocks.1.ff.net.0.proj.lora_A.weight', 'transformer_blocks.22.attn1.to_out.0.lora_A.weight', 'transformer_blocks.11.ff.net.2.lora_A.weight', 'transformer_blocks.0.ff.net.2.lora_B.weight', 'transformer_blocks.14.attn1.to_v.lora_A.weight', 'transformer_blocks.7.attn1.to_v.lora_A.weight', 'transformer_blocks.1.ff.net.2.lora_A.weight', 'transformer_blocks.13.attn1.to_out.0.lora_A.weight', 'transformer_blocks.20.ff.net.2.lora_A.weight', 'transformer_blocks.2.ff.net.0.proj.lora_B.weight', 'transformer_blocks.11.attn2.to_out.0.lora_B.weight', 'transformer_blocks.8.attn1.to_out.0.lora_B.weight', 'transformer_blocks.4.attn2.to_v.lora_A.weight', 'transformer_blocks.20.attn2.to_q.lora_A.weight', 'transformer_blocks.1.attn2.to_v.lora_B.weight', 'transformer_blocks.14.attn2.to_q.lora_B.weight', 'transformer_blocks.23.attn1.to_q.lora_B.weight', 'transformer_blocks.26.attn1.to_out.0.lora_A.weight', 'transformer_blocks.16.attn1.to_k.lora_A.weight', 'transformer_blocks.9.attn1.to_v.lora_B.weight', 'transformer_blocks.22.attn2.to_v.lora_B.weight', 'transformer_blocks.3.attn2.to_q.lora_A.weight', 'transformer_blocks.24.attn1.to_q.lora_A.weight', 'transformer_blocks.18.attn2.to_v.lora_A.weight', 'transformer_blocks.9.ff.net.0.proj.lora_B.weight', 'transformer_blocks.25.attn2.to_k.lora_A.weight', 'transformer_blocks.12.attn2.to_k.lora_A.weight', 'transformer_blocks.20.attn1.to_v.lora_A.weight', 'transformer_blocks.17.attn2.to_k.lora_B.weight', 'transformer_blocks.16.ff.net.2.lora_B.weight', 
'transformer_blocks.0.attn2.to_v.lora_A.weight', 'transformer_blocks.17.attn1.to_q.lora_A.weight', 'transformer_blocks.18.attn2.to_v.lora_B.weight', 'transformer_blocks.7.ff.net.2.lora_B.weight', 'transformer_blocks.0.attn2.to_out.0.lora_A.weight', 'transformer_blocks.20.attn2.to_q.lora_B.weight', 'transformer_blocks.25.attn2.to_v.lora_B.weight', 'transformer_blocks.26.attn2.to_v.lora_B.weight', 'transformer_blocks.3.ff.net.0.proj.lora_B.weight', 'transformer_blocks.11.attn1.to_v.lora_B.weight', 'transformer_blocks.8.attn2.to_v.lora_A.weight', 'transformer_blocks.10.ff.net.2.lora_B.weight', 'transformer_blocks.7.ff.net.2.lora_A.weight', 'transformer_blocks.17.attn1.to_out.0.lora_B.weight', 'transformer_blocks.24.ff.net.0.proj.lora_A.weight', 'transformer_blocks.21.attn2.to_v.lora_A.weight', 'transformer_blocks.14.attn2.to_v.lora_B.weight', 'transformer_blocks.13.attn2.to_k.lora_A.weight', 'transformer_blocks.24.attn1.to_k.lora_A.weight', 'transformer_blocks.23.attn1.to_k.lora_B.weight', 'transformer_blocks.23.ff.net.2.lora_A.weight', 'transformer_blocks.3.ff.net.2.lora_B.weight', 'transformer_blocks.14.attn1.to_k.lora_A.weight', 'transformer_blocks.23.attn2.to_k.lora_B.weight', 'transformer_blocks.19.attn1.to_out.0.lora_A.weight', 'transformer_blocks.6.attn2.to_v.lora_B.weight', 'transformer_blocks.1.attn2.to_out.0.lora_A.weight', 'transformer_blocks.19.attn1.to_v.lora_A.weight', 'transformer_blocks.2.attn1.to_out.0.lora_A.weight', 'transformer_blocks.16.attn1.to_v.lora_B.weight', 'transformer_blocks.13.ff.net.0.proj.lora_B.weight', 'transformer_blocks.11.attn1.to_out.0.lora_B.weight', 'transformer_blocks.25.attn1.to_v.lora_A.weight', 'transformer_blocks.0.attn2.to_k.lora_B.weight', 'transformer_blocks.12.ff.net.0.proj.lora_A.weight', 'transformer_blocks.15.ff.net.0.proj.lora_B.weight', 'transformer_blocks.6.ff.net.0.proj.lora_B.weight', 'transformer_blocks.10.attn2.to_out.0.lora_A.weight', 'transformer_blocks.14.attn2.to_out.0.lora_B.weight', 'transformer_blocks.20.attn2.to_out.0.lora_B.weight', 'transformer_blocks.15.attn2.to_v.lora_A.weight', 'transformer_blocks.18.attn1.to_out.0.lora_A.weight', 'transformer_blocks.5.attn2.to_k.lora_B.weight', 'transformer_blocks.8.attn1.to_k.lora_B.weight', 'transformer_blocks.25.ff.net.0.proj.lora_A.weight', 'transformer_blocks.17.attn2.to_out.0.lora_A.weight', 'transformer_blocks.18.attn1.to_q.lora_B.weight', 'transformer_blocks.0.ff.net.0.proj.lora_A.weight', 'transformer_blocks.17.attn2.to_k.lora_A.weight', 'transformer_blocks.26.ff.net.2.lora_A.weight', 'transformer_blocks.4.ff.net.0.proj.lora_A.weight', 'transformer_blocks.15.ff.net.0.proj.lora_A.weight', 'transformer_blocks.10.ff.net.0.proj.lora_B.weight', 'transformer_blocks.8.attn1.to_v.lora_A.weight', 'transformer_blocks.14.attn1.to_q.lora_A.weight', 'transformer_blocks.22.attn1.to_v.lora_A.weight', 'transformer_blocks.6.attn2.to_q.lora_A.weight', 'transformer_blocks.6.attn1.to_v.lora_A.weight', 'transformer_blocks.2.attn2.to_out.0.lora_B.weight', 'transformer_blocks.19.ff.net.2.lora_A.weight', 'transformer_blocks.25.attn1.to_out.0.lora_A.weight', 'transformer_blocks.2.attn1.to_v.lora_B.weight', 'transformer_blocks.6.attn2.to_out.0.lora_A.weight', 'transformer_blocks.7.attn2.to_k.lora_A.weight', 'transformer_blocks.10.attn2.to_v.lora_A.weight', 'transformer_blocks.24.attn2.to_out.0.lora_A.weight', 'transformer_blocks.1.ff.net.0.proj.lora_B.weight', 'transformer_blocks.4.attn1.to_v.lora_B.weight', 'transformer_blocks.26.attn1.to_q.lora_B.weight', 
'transformer_blocks.18.attn2.to_k.lora_B.weight', 'transf
PixArt: LoRA conversion has missing keys! (probably)
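For context, the keys in this adapter follow the diffusers/PEFT naming scheme (<module path>.lora_A.weight / <module path>.lora_B.weight). Below is a minimal, illustrative sketch of pairing them into (up, down) tensors and reporting leftovers; pair_peft_lora_keys is a made-up helper, not this repo's lora.py, and the conventions noted in the comments are assumptions about what a patcher expects:

```python
# Illustrative helper (not this repo's lora.py): pair PEFT-style lora_A/lora_B
# tensors by their common module path and report anything left unmatched.
from safetensors.torch import load_file

def pair_peft_lora_keys(state_dict):
    pairs, leftover = {}, []
    for key, tensor in state_dict.items():
        if ".lora_A." in key:
            base = key.split(".lora_A.")[0]            # e.g. transformer_blocks.0.attn1.to_q
            b_key = key.replace(".lora_A.", ".lora_B.")
            if b_key in state_dict:
                # PEFT stores A as (rank, in_features) and B as (out_features, rank);
                # an (up, down) patch pair would be (B, A). This ordering is an
                # assumption about the consuming patcher, not a statement about lora.py.
                pairs[base] = (state_dict[b_key], tensor)
            else:
                leftover.append(key)
        elif ".lora_B." in key and key.replace(".lora_B.", ".lora_A.") not in state_dict:
            leftover.append(key)
    return pairs, leftover

# Example usage (file name assumed; adjust to whatever is inside the zip):
# sd = load_file("adapter_model.safetensors")
# pairs, leftover = pair_peft_lora_keys(sd)
# print(len(pairs), "modules paired,", len(leftover), "leftover keys")
```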

frutiemax92 commented Jul 1, 2024

I replaced the get_lora_depth function, but the ff and to_out layers are still not handled correctly.

PixArt: LoRA conversion has leftover keys! (350 vs 574)
['transformer_blocks.14.ff.net.0.proj.lora_A.weight', 'transformer_blocks.18.ff.net.2.lora_B.weight', 'transformer_blocks.7.ff.net.0.proj.lora_A.weight', 'transformer_blocks.7.attn1.to_out.0.lora_A.weight', 'transformer_blocks.4.attn1.to_out.0.lora_A.weight', 'transformer_blocks.0.ff.net.0.proj.lora_B.weight', 'transformer_blocks.21.ff.net.2.lora_A.weight', 'transformer_blocks.9.ff.net.0.proj.lora_B.weight', 'transformer_blocks.11.ff.net.2.lora_A.weight', 'transformer_blocks.17.ff.net.2.lora_B.weight', 'transformer_blocks.24.ff.net.2.lora_B.weight', 'transformer_blocks.5.attn1.to_out.0.lora_B.weight', 'transformer_blocks.26.ff.net.2.lora_A.weight', 'transformer_blocks.19.attn1.to_out.0.lora_B.weight', 'transformer_blocks.19.attn2.to_out.0.lora_B.weight', 'transformer_blocks.23.ff.net.2.lora_A.weight', 'transformer_blocks.2.ff.net.0.proj.lora_A.weight', 'transformer_blocks.18.attn1.to_out.0.lora_A.weight', 'transformer_blocks.8.ff.net.2.lora_B.weight', 'transformer_blocks.23.attn1.to_out.0.lora_B.weight', 'transformer_blocks.4.ff.net.0.proj.lora_B.weight', 'transformer_blocks.2.ff.net.2.lora_A.weight', 'transformer_blocks.26.attn1.to_out.0.lora_B.weight', 'transformer_blocks.5.attn1.to_out.0.lora_A.weight', 'transformer_blocks.5.attn2.to_out.0.lora_A.weight', 'transformer_blocks.20.ff.net.0.proj.lora_A.weight', 'transformer_blocks.15.ff.net.2.lora_B.weight', 'transformer_blocks.4.ff.net.2.lora_A.weight', 'transformer_blocks.10.attn1.to_out.0.lora_A.weight', 'transformer_blocks.26.attn1.to_out.0.lora_A.weight', 'transformer_blocks.16.attn2.to_out.0.lora_B.weight', 'transformer_blocks.23.ff.net.0.proj.lora_A.weight', 'transformer_blocks.13.attn2.to_out.0.lora_B.weight', 'transformer_blocks.8.ff.net.0.proj.lora_A.weight', 'transformer_blocks.27.ff.net.2.lora_B.weight', 'transformer_blocks.24.attn1.to_out.0.lora_B.weight', 'transformer_blocks.15.attn2.to_out.0.lora_B.weight', 'transformer_blocks.12.ff.net.0.proj.lora_B.weight', 'transformer_blocks.16.ff.net.0.proj.lora_A.weight', 'transformer_blocks.23.ff.net.0.proj.lora_B.weight', 'transformer_blocks.16.ff.net.2.lora_A.weight', 'transformer_blocks.8.attn1.to_out.0.lora_B.weight', 'transformer_blocks.22.ff.net.2.lora_B.weight', 'transformer_blocks.11.ff.net.0.proj.lora_B.weight', 'transformer_blocks.26.attn2.to_out.0.lora_B.weight', 'transformer_blocks.9.attn1.to_out.0.lora_B.weight', 'transformer_blocks.10.attn2.to_out.0.lora_A.weight', 'transformer_blocks.17.ff.net.0.proj.lora_B.weight', 'transformer_blocks.21.attn1.to_out.0.lora_B.weight', 'transformer_blocks.15.attn2.to_out.0.lora_A.weight', 'transformer_blocks.3.ff.net.2.lora_A.weight', 'transformer_blocks.7.attn1.to_out.0.lora_B.weight', 'transformer_blocks.14.attn2.to_out.0.lora_A.weight', 'transformer_blocks.16.attn1.to_out.0.lora_A.weight', 'transformer_blocks.12.ff.net.0.proj.lora_A.weight', 'transformer_blocks.23.attn1.to_out.0.lora_A.weight', 'transformer_blocks.20.attn1.to_out.0.lora_B.weight', 'transformer_blocks.3.attn2.to_out.0.lora_A.weight', 'transformer_blocks.26.ff.net.0.proj.lora_A.weight', 'transformer_blocks.19.ff.net.2.lora_A.weight', 'transformer_blocks.13.ff.net.2.lora_A.weight', 'transformer_blocks.7.ff.net.2.lora_B.weight', 'transformer_blocks.8.ff.net.0.proj.lora_B.weight', 'transformer_blocks.10.ff.net.0.proj.lora_A.weight', 'transformer_blocks.16.ff.net.0.proj.lora_B.weight', 'transformer_blocks.24.ff.net.0.proj.lora_B.weight', 'transformer_blocks.10.ff.net.0.proj.lora_B.weight', 'transformer_blocks.2.attn1.to_out.0.lora_A.weight', 
'transformer_blocks.23.ff.net.2.lora_B.weight', 'transformer_blocks.9.ff.net.0.proj.lora_A.weight', 'transformer_blocks.12.attn2.to_out.0.lora_B.weight', 'transformer_blocks.1.attn1.to_out.0.lora_B.weight', 'transformer_blocks.10.attn1.to_out.0.lora_B.weight', 'transformer_blocks.9.attn2.to_out.0.lora_A.weight', 'transformer_blocks.22.attn1.to_out.0.lora_A.weight', 'transformer_blocks.20.attn2.to_out.0.lora_A.weight', 'transformer_blocks.4.attn2.to_out.0.lora_B.weight', 'transformer_blocks.6.ff.net.0.proj.lora_B.weight', 'transformer_blocks.0.ff.net.2.lora_B.weight', 'transformer_blocks.27.ff.net.0.proj.lora_A.weight', 'transformer_blocks.1.attn2.to_out.0.lora_A.weight', 'transformer_blocks.14.ff.net.2.lora_B.weight', 'transformer_blocks.14.attn1.to_out.0.lora_A.weight', 'transformer_blocks.13.attn2.to_out.0.lora_A.weight', 'transformer_blocks.7.attn2.to_out.0.lora_A.weight', 'transformer_blocks.21.attn2.to_out.0.lora_A.weight', 'transformer_blocks.9.ff.net.2.lora_A.weight', 'transformer_blocks.11.ff.net.0.proj.lora_A.weight', 'transformer_blocks.6.ff.net.2.lora_B.weight', 'transformer_blocks.18.ff.net.0.proj.lora_B.weight', 'transformer_blocks.13.attn1.to_out.0.lora_B.weight', 'transformer_blocks.6.attn2.to_out.0.lora_A.weight', 'transformer_blocks.0.attn1.to_out.0.lora_A.weight', 'transformer_blocks.11.attn2.to_out.0.lora_A.weight', 'transformer_blocks.5.ff.net.2.lora_A.weight', 'transformer_blocks.1.attn1.to_out.0.lora_A.weight', 'transformer_blocks.25.ff.net.0.proj.lora_A.weight', 'transformer_blocks.19.attn1.to_out.0.lora_A.weight', 'transformer_blocks.23.attn2.to_out.0.lora_B.weight', 'transformer_blocks.10.ff.net.2.lora_A.weight', 'transformer_blocks.4.ff.net.2.lora_B.weight', 'transformer_blocks.16.ff.net.2.lora_B.weight', 'transformer_blocks.13.attn1.to_out.0.lora_A.weight', 'transformer_blocks.3.ff.net.0.proj.lora_A.weight', 'transformer_blocks.4.attn2.to_out.0.lora_A.weight', 'transformer_blocks.3.attn2.to_out.0.lora_B.weight', 'transformer_blocks.10.ff.net.2.lora_B.weight', 'transformer_blocks.6.ff.net.0.proj.lora_A.weight', 'transformer_blocks.6.ff.net.2.lora_A.weight', 'transformer_blocks.0.attn2.to_out.0.lora_B.weight', 'transformer_blocks.19.ff.net.0.proj.lora_B.weight', 'transformer_blocks.13.ff.net.0.proj.lora_A.weight', 'transformer_blocks.24.attn2.to_out.0.lora_B.weight', 'transformer_blocks.1.attn2.to_out.0.lora_B.weight', 'transformer_blocks.25.attn2.to_out.0.lora_B.weight', 'transformer_blocks.27.attn2.to_out.0.lora_B.weight', 'transformer_blocks.20.ff.net.2.lora_B.weight', 'transformer_blocks.1.ff.net.0.proj.lora_A.weight', 'transformer_blocks.3.ff.net.0.proj.lora_B.weight', 'transformer_blocks.12.attn1.to_out.0.lora_B.weight', 'transformer_blocks.24.attn1.to_out.0.lora_A.weight', 'transformer_blocks.3.attn1.to_out.0.lora_A.weight', 'transformer_blocks.9.ff.net.2.lora_B.weight', 'transformer_blocks.17.attn1.to_out.0.lora_B.weight', 'transformer_blocks.27.ff.net.0.proj.lora_B.weight', 'transformer_blocks.20.attn2.to_out.0.lora_B.weight', 'transformer_blocks.24.ff.net.0.proj.lora_A.weight', 'transformer_blocks.1.ff.net.0.proj.lora_B.weight', 'transformer_blocks.11.attn1.to_out.0.lora_A.weight', 'transformer_blocks.2.ff.net.0.proj.lora_B.weight', 'transformer_blocks.17.attn2.to_out.0.lora_A.weight', 'transformer_blocks.6.attn1.to_out.0.lora_A.weight', 'transformer_blocks.11.attn1.to_out.0.lora_B.weight', 'transformer_blocks.7.ff.net.0.proj.lora_B.weight', 'transformer_blocks.12.ff.net.2.lora_A.weight', 'transformer_blocks.16.attn1.to_out.0.lora_B.weight', 
'transformer_blocks.14.ff.net.0.proj.lora_B.weight', 'transformer_blocks.25.attn1.to_out.0.lora_A.weight', 'transformer_blocks.26.ff.net.2.lora_B.weight', 'transformer_blocks.11.attn2.to_out.0.lora_B.weight', 'transformer_blocks.5.ff.net.0.proj.lora_B.weight', 'transformer_blocks.9.attn2.to_out.0.lora_B.weight', 'transformer_blocks.15.ff.net.2.lora_A.weight', 'transformer_blocks.12.attn1.to_out.0.lora_A.weight', 'transformer_blocks.15.ff.net.0.proj.lora_B.weight', 'transformer_blocks.18.ff.net.2.lora_A.weight', 'transformer_blocks.0.attn2.to_out.0.lora_A.weight', 'transformer_blocks.18.attn2.to_out.0.lora_A.weight', 'transformer_blocks.3.attn1.to_out.0.lora_B.weight', 'transformer_blocks.20.attn1.to_out.0.lora_A.weight', 'transformer_blocks.14.attn2.to_out.0.lora_B.weight', 'transformer_blocks.22.attn1.to_out.0.lora_B.weight', 'transformer_blocks.25.ff.net.0.proj.lora_B.weight', 'transformer_blocks.2.attn2.to_out.0.lora_A.weight', 'transformer_blocks.14.attn1.to_out.0.lora_B.weight', 'transformer_blocks.21.attn1.to_out.0.lora_A.weight', 'transformer_blocks.22.attn2.to_out.0.lora_B.weight', 'transformer_blocks.7.ff.net.2.lora_A.weight', 'transformer_blocks.25.attn1.to_out.0.lora_B.weight', 'transformer_blocks.12.attn2.to_out.0.lora_A.weight', 'transformer_blocks.4.ff.net.0.proj.lora_A.weight', 'transformer_blocks.24.ff.net.2.lora_A.weight', 'transformer_blocks.27.attn2.to_out.0.lora_A.weight', 'transformer_blocks.17.ff.net.0.proj.lora_A.weight', 'transformer_blocks.5.attn2.to_out.0.lora_B.weight', 'transformer_blocks.19.attn2.to_out.0.lora_A.weight', 'transformer_blocks.5.ff.net.2.lora_B.weight', 'transformer_blocks.11.ff.net.2.lora_B.weight', 'transformer_blocks.0.attn1.to_out.0.lora_B.weight', 'transformer_blocks.2.attn1.to_out.0.lora_B.weight', 'transformer_blocks.21.ff.net.2.lora_B.weight', 'transformer_blocks.17.attn2.to_out.0.lora_B.weight', 'transformer_blocks.10.attn2.to_out.0.lora_B.weight', 'transformer_blocks.14.ff.net.2.lora_A.weight', 'transformer_blocks.8.attn2.to_out.0.lora_A.weight', 'transformer_blocks.25.ff.net.2.lora_A.weight', 'transformer_blocks.6.attn1.to_out.0.lora_B.weight', 'transformer_blocks.12.ff.net.2.lora_B.weight', 'transformer_blocks.19.ff.net.0.proj.lora_A.weight', 'transformer_blocks.22.attn2.to_out.0.lora_A.weight', 'transformer_blocks.27.attn1.to_out.0.lora_A.weight', 'transformer_blocks.6.attn2.to_out.0.lora_B.weight', 'transformer_blocks.24.attn2.to_out.0.lora_A.weight', 'transformer_blocks.26.ff.net.0.proj.lora_B.weight', 'transformer_blocks.17.attn1.to_out.0.lora_A.weight', 'transformer_blocks.18.attn2.to_out.0.lora_B.weight', 'transformer_blocks.13.ff.net.0.proj.lora_B.weight', 'transformer_blocks.22.ff.net.2.lora_A.weight', 'transformer_blocks.20.ff.net.2.lora_A.weight', 'transformer_blocks.15.attn1.to_out.0.lora_B.weight', 'transformer_blocks.9.attn1.to_out.0.lora_A.weight', 'transformer_blocks.26.attn2.to_out.0.lora_A.weight', 'transformer_blocks.1.ff.net.2.lora_B.weight', 'transformer_blocks.0.ff.net.2.lora_A.weight', 'transformer_blocks.1.ff.net.2.lora_A.weight', 'transformer_blocks.13.ff.net.2.lora_B.weight', 'transformer_blocks.16.attn2.to_out.0.lora_A.weight', 'transformer_blocks.19.ff.net.2.lora_B.weight', 'transformer_blocks.23.attn2.to_out.0.lora_A.weight', 'transformer_blocks.17.ff.net.2.lora_A.weight', 'transformer_blocks.2.ff.net.2.lora_B.weight', 'transformer_blocks.20.ff.net.0.proj.lora_B.weight', 'transformer_blocks.4.attn1.to_out.0.lora_B.weight', 'transformer_blocks.22.ff.net.0.proj.lora_A.weight', 
'transformer_blocks.8.ff.net.2.lora_A.weight', 'transformer_blocks.21.attn2.to_out.0.lora_B.weight', 'transformer_blocks.18.attn1.to_out.0.lora_B.weight', 'transformer_blocks.21.ff.net.0.proj.lora_A.weight', 'transformer_blocks.8.attn1.to_out.0.lora_A.weight', 'transformer_blocks.27.attn1.to_out.0.lora_B.weight', 'transformer_blocks.25.attn2.to_out.0.lora_A.weight', 'transformer_blocks.27.ff.net.2.lora_A.weight', 'transformer_blocks.5.ff.net.0.proj.lora_A.weight', 'transformer_blocks.25.ff.net.2.lora_B.weight', 'transformer_blocks.18.ff.net.0.proj.lora_A.weight', 'transformer_blocks.15.attn1.to_out.0.lora_A.weight', 'transformer_blocks.2.attn2.to_out.0.lora_B.weight', 'transformer_blocks.21.ff.net.0.proj.lora_B.weight', 'transformer_blocks.8.attn2.to_out.0.lora_B.weight', 'transformer_blocks.3.ff.net.2.lora_B.weight', 'transformer_blocks.22.ff.net.0.proj.lora_B.weight', 'transformer_blocks.0.ff.net.0.proj.lora_A.weight', 'transformer_blocks.15.ff.net.0.proj.lora_A.weight', 'transformer_blocks.7.attn2.to_out.0.lora_B.weight'] 
PixArt: LoRA conversion has missing keys! (probably)
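All of the leftover keys above are to_out.0, ff.net.0.proj or ff.net.2 projections, which suggests the matching still skips modules whose path has an extra segment (the Sequential index ".0" on the output projection, and the GEGLU ".proj"). A quick, illustrative check (not repo code) that classifies a leftover list by those suffixes:

```python
# Illustrative check: bucket leftover keys by the module families that the
# depth-based matching appears to miss (extra ".0" / ".proj" path segments).
def classify_leftovers(leftover_keys):
    buckets = {"to_out.0": 0, "ff.net.0.proj": 0, "ff.net.2": 0, "other": 0}
    for key in leftover_keys:
        for family in ("to_out.0", "ff.net.0.proj", "ff.net.2"):
            if f".{family}." in key:
                buckets[family] += 1
                break
        else:
            buckets["other"] += 1
    return buckets

# With the list above, "other" comes out as 0, i.e. the remaining gap is
# entirely the attention output projections and the feed-forward layers.
```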

frutiemax92 commented

Even with the latest merged PR, I still get different results between applying the LoRA through the node and merging it into a checkpoint. I believe some layers/weights are still missing!
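
One way to narrow this down might be to merge a single layer offline with the textbook LoRA update and diff it against the weight the node produces. The sketch below assumes PEFT's default alpha/rank scaling; it is not this repo's patching code:

```python
import torch

def merge_lora_weight(w_base, lora_A, lora_B, lora_alpha, strength=1.0):
    """Textbook LoRA merge: W' = W + strength * (alpha / rank) * (B @ A).

    Assumes PEFT's default scaling (lora_alpha / rank). If the node path and a
    checkpoint merged this way disagree on a layer, that layer (or its scaling)
    is where the conversion loses information.
    """
    rank = lora_A.shape[0]                              # A is (rank, in_features)
    delta = (lora_B.float() @ lora_A.float()) * (lora_alpha / rank)
    return (w_base.float() + strength * delta).to(w_base.dtype)
```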

frutiemax92 commented

Here is a LoRA example I just published:
https://civitai.com/models/578802?modelVersionId=645548

I've found that I need to crank the LoRA strength quite a bit, i.e. up to 1.8, to get good results. Is there anything in the patching algorithm that would explain needing such a boost in strength?
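One possible explanation for needing ~1.8 would be a dropped alpha/rank factor: if the converter applies B @ A at scale 1.0 instead of PEFT's lora_alpha / rank, the user-facing strength has to absorb that factor. With hypothetical training settings (not read from this adapter) the arithmetic would look like this:

```python
# Hypothetical numbers, not taken from the published LoRA: if it were trained
# with lora_alpha=32 at rank=16, PEFT's effective scale is alpha / rank = 2.0.
# A converter that silently applies the patch at scale 1.0 would then need a
# user-facing strength of about 2.0 to match training, which is in the same
# ballpark as the ~1.8 observed here.
lora_alpha, rank = 32, 16
expected_scale = lora_alpha / rank   # 2.0
applied_scale = 1.0                  # what a converter that ignores alpha would use
compensating_strength = expected_scale / applied_scale
print(compensating_strength)         # 2.0
```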
