Issue with loading PixArt Sigma Loras #60

Open
chrish-slingshot opened this issue Jun 11, 2024 · 18 comments


@chrish-slingshot

Hey all. I'm running into issues when trying to load a LoRA created in OneTrainer for PixArt Sigma. No matter what options I use to train the LoRA, I always get a flood of warning messages in the ComfyUI console when it hits the PixArt Lora Loader node:

NOT LOADED diffusion_model.lora_transformer_transformer_blocks

The OneTrainer training run completes successfully; I've tried various fp16/fp32/bf16 settings. Can anyone offer any guidance on this?

@city96
Owner

city96 commented Jun 11, 2024

Hi. If you don't mind, could you share one of the LoRA files so I can implement support for it?

The current implementation is largely based on the peft LoRA from the example training script.
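Since the loader expects peft-style key names while OneTrainer emits flattened underscore-joined keys (as seen in the log below), a fix would likely involve remapping one naming scheme onto the other. A minimal sketch of such a remap, assuming a hypothetical dotted target format — the exact names the loader expects are an assumption here, not taken from the repository:

```python
import re

# Leaf module names whose internal underscores are part of the name,
# checked longest-first so "to_out_0" wins over "to_k"/"to_q"/"to_v".
LEAF_NAMES = ("to_out_0", "to_k", "to_q", "to_v")

def unflatten_key(key: str) -> str:
    """Hypothetical remap of a flattened OneTrainer LoRA key, e.g.
    'lora_transformer_transformer_blocks_0_attn1_to_k.weight'
    -> 'transformer_blocks.0.attn1.to_k.weight'."""
    key, _, suffix = key.partition(".")          # split off '.weight'
    key = key.removeprefix("lora_transformer_")  # drop the trainer prefix
    for leaf in LEAF_NAMES:                      # re-attach the leaf with a dot
        if key.endswith(leaf):
            key = key[: -len(leaf) - 1] + "." + leaf
            break
    # turn '_<block index>_' separators into proper sub-module indices
    key = re.sub(r"_(\d+)_", r".\1.", key)
    return key + ("." + suffix if suffix else "")
```

The same idea would apply to the `lora_te_*` text-encoder keys, just with a different prefix and leaf list.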

@chrish-slingshot
Author

Will do, thanks! I'm in the middle of a training run right now, but I'll upload a file as soon as possible.

@chrish-slingshot
Author

Example workflow.

lora_example.json

https://www.dropbox.com/scl/fi/43p5ym172sn65hlsm3wdu/test_lora.safetensors?rlkey=tr3dkoqfbjgvqsbdkufkxsl0x&st=gq809msv&dl=0

These are the warning messages:

NOT LOADED diffusion_model.lora_te_encoder_block_0_layer_0_SelfAttention_o.weight
NOT LOADED diffusion_model.lora_te_encoder_block_0_layer_0_SelfAttention_q.weight
NOT LOADED diffusion_model.lora_te_encoder_block_0_layer_0_SelfAttention_v.weight
NOT LOADED diffusion_model.lora_te_encoder_block_0_layer_1_DenseReluDense_wi_0.weight
NOT LOADED diffusion_model.lora_te_encoder_block_0_layer_1_DenseReluDense_wi_1.weight
NOT LOADED diffusion_model.lora_te_encoder_block_0_layer_1_DenseReluDense_wo.weight
NOT LOADED diffusion_model.lora_te_encoder_block_10_layer_0_SelfAttention_k.weight
NOT LOADED diffusion_model.lora_te_encoder_block_10_layer_0_SelfAttention_o.weight
NOT LOADED diffusion_model.lora_te_encoder_block_10_layer_0_SelfAttention_q.weight
NOT LOADED diffusion_model.lora_te_encoder_block_10_layer_0_SelfAttention_v.weight
NOT LOADED diffusion_model.lora_te_encoder_block_10_layer_1_DenseReluDense_wi_0.weight
NOT LOADED diffusion_model.lora_te_encoder_block_10_layer_1_DenseReluDense_wi_1.weight
NOT LOADED diffusion_model.lora_te_encoder_block_10_layer_1_DenseReluDense_wo.weight
NOT LOADED diffusion_model.lora_te_encoder_block_11_layer_0_SelfAttention_k.weight
NOT LOADED diffusion_model.lora_te_encoder_block_11_layer_0_SelfAttention_o.weight
NOT LOADED diffusion_model.lora_te_encoder_block_11_layer_0_SelfAttention_q.weight
NOT LOADED diffusion_model.lora_te_encoder_block_11_layer_0_SelfAttention_v.weight
NOT LOADED diffusion_model.lora_te_encoder_block_11_layer_1_DenseReluDense_wi_0.weight
NOT LOADED diffusion_model.lora_te_encoder_block_11_layer_1_DenseReluDense_wi_1.weight
NOT LOADED diffusion_model.lora_te_encoder_block_11_layer_1_DenseReluDense_wo.weight
NOT LOADED diffusion_model.lora_te_encoder_block_12_layer_0_SelfAttention_k.weight
NOT LOADED diffusion_model.lora_te_encoder_block_12_layer_0_SelfAttention_o.weight
NOT LOADED diffusion_model.lora_te_encoder_block_12_layer_0_SelfAttention_q.weight
NOT LOADED diffusion_model.lora_te_encoder_block_12_layer_0_SelfAttention_v.weight
NOT LOADED diffusion_model.lora_te_encoder_block_12_layer_1_DenseReluDense_wi_0.weight
NOT LOADED diffusion_model.lora_te_encoder_block_12_layer_1_DenseReluDense_wi_1.weight
NOT LOADED diffusion_model.lora_te_encoder_block_12_layer_1_DenseReluDense_wo.weight
NOT LOADED diffusion_model.lora_te_encoder_block_13_layer_0_SelfAttention_k.weight
NOT LOADED diffusion_model.lora_te_encoder_block_13_layer_0_SelfAttention_o.weight
NOT LOADED diffusion_model.lora_te_encoder_block_13_layer_0_SelfAttention_q.weight
NOT LOADED diffusion_model.lora_te_encoder_block_13_layer_0_SelfAttention_v.weight
NOT LOADED diffusion_model.lora_te_encoder_block_13_layer_1_DenseReluDense_wi_0.weight
NOT LOADED diffusion_model.lora_te_encoder_block_13_layer_1_DenseReluDense_wi_1.weight
NOT LOADED diffusion_model.lora_te_encoder_block_13_layer_1_DenseReluDense_wo.weight
NOT LOADED diffusion_model.lora_te_encoder_block_14_layer_0_SelfAttention_k.weight
NOT LOADED diffusion_model.lora_te_encoder_block_14_layer_0_SelfAttention_o.weight
NOT LOADED diffusion_model.lora_te_encoder_block_14_layer_0_SelfAttention_q.weight
NOT LOADED diffusion_model.lora_te_encoder_block_14_layer_0_SelfAttention_v.weight
NOT LOADED diffusion_model.lora_te_encoder_block_14_layer_1_DenseReluDense_wi_0.weight
NOT LOADED diffusion_model.lora_te_encoder_block_14_layer_1_DenseReluDense_wi_1.weight
NOT LOADED diffusion_model.lora_te_encoder_block_14_layer_1_DenseReluDense_wo.weight
NOT LOADED diffusion_model.lora_te_encoder_block_15_layer_0_SelfAttention_k.weight
NOT LOADED diffusion_model.lora_te_encoder_block_15_layer_0_SelfAttention_o.weight
NOT LOADED diffusion_model.lora_te_encoder_block_15_layer_0_SelfAttention_q.weight
NOT LOADED diffusion_model.lora_te_encoder_block_15_layer_0_SelfAttention_v.weight
NOT LOADED diffusion_model.lora_te_encoder_block_15_layer_1_DenseReluDense_wi_0.weight
NOT LOADED diffusion_model.lora_te_encoder_block_15_layer_1_DenseReluDense_wi_1.weight
NOT LOADED diffusion_model.lora_te_encoder_block_15_layer_1_DenseReluDense_wo.weight
NOT LOADED diffusion_model.lora_te_encoder_block_16_layer_0_SelfAttention_k.weight
NOT LOADED diffusion_model.lora_te_encoder_block_16_layer_0_SelfAttention_o.weight
NOT LOADED diffusion_model.lora_te_encoder_block_16_layer_0_SelfAttention_q.weight
NOT LOADED diffusion_model.lora_te_encoder_block_16_layer_0_SelfAttention_v.weight
NOT LOADED diffusion_model.lora_te_encoder_block_16_layer_1_DenseReluDense_wi_0.weight
NOT LOADED diffusion_model.lora_te_encoder_block_16_layer_1_DenseReluDense_wi_1.weight
NOT LOADED diffusion_model.lora_te_encoder_block_16_layer_1_DenseReluDense_wo.weight
NOT LOADED diffusion_model.lora_te_encoder_block_17_layer_0_SelfAttention_k.weight
NOT LOADED diffusion_model.lora_te_encoder_block_17_layer_0_SelfAttention_o.weight
NOT LOADED diffusion_model.lora_te_encoder_block_17_layer_0_SelfAttention_q.weight
NOT LOADED diffusion_model.lora_te_encoder_block_17_layer_0_SelfAttention_v.weight
NOT LOADED diffusion_model.lora_te_encoder_block_17_layer_1_DenseReluDense_wi_0.weight
NOT LOADED diffusion_model.lora_te_encoder_block_17_layer_1_DenseReluDense_wi_1.weight
NOT LOADED diffusion_model.lora_te_encoder_block_17_layer_1_DenseReluDense_wo.weight
NOT LOADED diffusion_model.lora_te_encoder_block_18_layer_0_SelfAttention_k.weight
NOT LOADED diffusion_model.lora_te_encoder_block_18_layer_0_SelfAttention_o.weight
NOT LOADED diffusion_model.lora_te_encoder_block_18_layer_0_SelfAttention_q.weight
NOT LOADED diffusion_model.lora_te_encoder_block_18_layer_0_SelfAttention_v.weight
NOT LOADED diffusion_model.lora_te_encoder_block_18_layer_1_DenseReluDense_wi_0.weight
NOT LOADED diffusion_model.lora_te_encoder_block_18_layer_1_DenseReluDense_wi_1.weight
NOT LOADED diffusion_model.lora_te_encoder_block_18_layer_1_DenseReluDense_wo.weight
NOT LOADED diffusion_model.lora_te_encoder_block_19_layer_0_SelfAttention_k.weight
NOT LOADED diffusion_model.lora_te_encoder_block_19_layer_0_SelfAttention_o.weight
NOT LOADED diffusion_model.lora_te_encoder_block_19_layer_0_SelfAttention_q.weight
NOT LOADED diffusion_model.lora_te_encoder_block_19_layer_0_SelfAttention_v.weight
NOT LOADED diffusion_model.lora_te_encoder_block_19_layer_1_DenseReluDense_wi_0.weight
NOT LOADED diffusion_model.lora_te_encoder_block_19_layer_1_DenseReluDense_wi_1.weight
NOT LOADED diffusion_model.lora_te_encoder_block_19_layer_1_DenseReluDense_wo.weight
NOT LOADED diffusion_model.lora_te_encoder_block_1_layer_0_SelfAttention_k.weight
NOT LOADED diffusion_model.lora_te_encoder_block_1_layer_0_SelfAttention_o.weight
NOT LOADED diffusion_model.lora_te_encoder_block_1_layer_0_SelfAttention_q.weight
NOT LOADED diffusion_model.lora_te_encoder_block_1_layer_0_SelfAttention_v.weight
NOT LOADED diffusion_model.lora_te_encoder_block_1_layer_1_DenseReluDense_wi_0.weight
NOT LOADED diffusion_model.lora_te_encoder_block_1_layer_1_DenseReluDense_wi_1.weight
NOT LOADED diffusion_model.lora_te_encoder_block_1_layer_1_DenseReluDense_wo.weight
NOT LOADED diffusion_model.lora_te_encoder_block_20_layer_0_SelfAttention_k.weight
NOT LOADED diffusion_model.lora_te_encoder_block_20_layer_0_SelfAttention_o.weight
NOT LOADED diffusion_model.lora_te_encoder_block_20_layer_0_SelfAttention_q.weight
NOT LOADED diffusion_model.lora_te_encoder_block_20_layer_0_SelfAttention_v.weight
NOT LOADED diffusion_model.lora_te_encoder_block_20_layer_1_DenseReluDense_wi_0.weight
NOT LOADED diffusion_model.lora_te_encoder_block_20_layer_1_DenseReluDense_wi_1.weight
NOT LOADED diffusion_model.lora_te_encoder_block_20_layer_1_DenseReluDense_wo.weight
NOT LOADED diffusion_model.lora_te_encoder_block_21_layer_0_SelfAttention_k.weight
NOT LOADED diffusion_model.lora_te_encoder_block_21_layer_0_SelfAttention_o.weight
NOT LOADED diffusion_model.lora_te_encoder_block_21_layer_0_SelfAttention_q.weight
NOT LOADED diffusion_model.lora_te_encoder_block_21_layer_0_SelfAttention_v.weight
NOT LOADED diffusion_model.lora_te_encoder_block_21_layer_1_DenseReluDense_wi_0.weight
NOT LOADED diffusion_model.lora_te_encoder_block_21_layer_1_DenseReluDense_wi_1.weight
NOT LOADED diffusion_model.lora_te_encoder_block_21_layer_1_DenseReluDense_wo.weight
NOT LOADED diffusion_model.lora_te_encoder_block_22_layer_0_SelfAttention_k.weight
NOT LOADED diffusion_model.lora_te_encoder_block_22_layer_0_SelfAttention_o.weight
NOT LOADED diffusion_model.lora_te_encoder_block_22_layer_0_SelfAttention_q.weight
NOT LOADED diffusion_model.lora_te_encoder_block_22_layer_0_SelfAttention_v.weight
NOT LOADED diffusion_model.lora_te_encoder_block_22_layer_1_DenseReluDense_wi_0.weight
NOT LOADED diffusion_model.lora_te_encoder_block_22_layer_1_DenseReluDense_wi_1.weight
NOT LOADED diffusion_model.lora_te_encoder_block_22_layer_1_DenseReluDense_wo.weight
NOT LOADED diffusion_model.lora_te_encoder_block_23_layer_0_SelfAttention_k.weight
NOT LOADED diffusion_model.lora_te_encoder_block_23_layer_0_SelfAttention_o.weight
NOT LOADED diffusion_model.lora_te_encoder_block_23_layer_0_SelfAttention_q.weight
NOT LOADED diffusion_model.lora_te_encoder_block_23_layer_0_SelfAttention_v.weight
NOT LOADED diffusion_model.lora_te_encoder_block_23_layer_1_DenseReluDense_wi_0.weight
NOT LOADED diffusion_model.lora_te_encoder_block_23_layer_1_DenseReluDense_wi_1.weight
NOT LOADED diffusion_model.lora_te_encoder_block_23_layer_1_DenseReluDense_wo.weight
NOT LOADED diffusion_model.lora_te_encoder_block_2_layer_0_SelfAttention_k.weight
NOT LOADED diffusion_model.lora_te_encoder_block_2_layer_0_SelfAttention_o.weight
NOT LOADED diffusion_model.lora_te_encoder_block_2_layer_0_SelfAttention_q.weight
NOT LOADED diffusion_model.lora_te_encoder_block_2_layer_0_SelfAttention_v.weight
NOT LOADED diffusion_model.lora_te_encoder_block_2_layer_1_DenseReluDense_wi_0.weight
NOT LOADED diffusion_model.lora_te_encoder_block_2_layer_1_DenseReluDense_wi_1.weight
NOT LOADED diffusion_model.lora_te_encoder_block_2_layer_1_DenseReluDense_wo.weight
NOT LOADED diffusion_model.lora_te_encoder_block_3_layer_0_SelfAttention_k.weight
NOT LOADED diffusion_model.lora_te_encoder_block_3_layer_0_SelfAttention_o.weight
NOT LOADED diffusion_model.lora_te_encoder_block_3_layer_0_SelfAttention_q.weight
NOT LOADED diffusion_model.lora_te_encoder_block_3_layer_0_SelfAttention_v.weight
NOT LOADED diffusion_model.lora_te_encoder_block_3_layer_1_DenseReluDense_wi_0.weight
NOT LOADED diffusion_model.lora_te_encoder_block_3_layer_1_DenseReluDense_wi_1.weight
NOT LOADED diffusion_model.lora_te_encoder_block_3_layer_1_DenseReluDense_wo.weight
NOT LOADED diffusion_model.lora_te_encoder_block_4_layer_0_SelfAttention_k.weight
NOT LOADED diffusion_model.lora_te_encoder_block_4_layer_0_SelfAttention_o.weight
NOT LOADED diffusion_model.lora_te_encoder_block_4_layer_0_SelfAttention_q.weight
NOT LOADED diffusion_model.lora_te_encoder_block_4_layer_0_SelfAttention_v.weight
NOT LOADED diffusion_model.lora_te_encoder_block_4_layer_1_DenseReluDense_wi_0.weight
NOT LOADED diffusion_model.lora_te_encoder_block_4_layer_1_DenseReluDense_wi_1.weight
NOT LOADED diffusion_model.lora_te_encoder_block_4_layer_1_DenseReluDense_wo.weight
NOT LOADED diffusion_model.lora_te_encoder_block_5_layer_0_SelfAttention_k.weight
NOT LOADED diffusion_model.lora_te_encoder_block_5_layer_0_SelfAttention_o.weight
NOT LOADED diffusion_model.lora_te_encoder_block_5_layer_0_SelfAttention_q.weight
NOT LOADED diffusion_model.lora_te_encoder_block_5_layer_0_SelfAttention_v.weight
NOT LOADED diffusion_model.lora_te_encoder_block_5_layer_1_DenseReluDense_wi_0.weight
NOT LOADED diffusion_model.lora_te_encoder_block_5_layer_1_DenseReluDense_wi_1.weight
NOT LOADED diffusion_model.lora_te_encoder_block_5_layer_1_DenseReluDense_wo.weight
NOT LOADED diffusion_model.lora_te_encoder_block_6_layer_0_SelfAttention_k.weight
NOT LOADED diffusion_model.lora_te_encoder_block_6_layer_0_SelfAttention_o.weight
NOT LOADED diffusion_model.lora_te_encoder_block_6_layer_0_SelfAttention_q.weight
NOT LOADED diffusion_model.lora_te_encoder_block_6_layer_0_SelfAttention_v.weight
NOT LOADED diffusion_model.lora_te_encoder_block_6_layer_1_DenseReluDense_wi_0.weight
NOT LOADED diffusion_model.lora_te_encoder_block_6_layer_1_DenseReluDense_wi_1.weight
NOT LOADED diffusion_model.lora_te_encoder_block_6_layer_1_DenseReluDense_wo.weight
NOT LOADED diffusion_model.lora_te_encoder_block_7_layer_0_SelfAttention_k.weight
NOT LOADED diffusion_model.lora_te_encoder_block_7_layer_0_SelfAttention_o.weight
NOT LOADED diffusion_model.lora_te_encoder_block_7_layer_0_SelfAttention_q.weight
NOT LOADED diffusion_model.lora_te_encoder_block_7_layer_0_SelfAttention_v.weight
NOT LOADED diffusion_model.lora_te_encoder_block_7_layer_1_DenseReluDense_wi_0.weight
NOT LOADED diffusion_model.lora_te_encoder_block_7_layer_1_DenseReluDense_wi_1.weight
NOT LOADED diffusion_model.lora_te_encoder_block_7_layer_1_DenseReluDense_wo.weight
NOT LOADED diffusion_model.lora_te_encoder_block_8_layer_0_SelfAttention_k.weight
NOT LOADED diffusion_model.lora_te_encoder_block_8_layer_0_SelfAttention_o.weight
NOT LOADED diffusion_model.lora_te_encoder_block_8_layer_0_SelfAttention_q.weight
NOT LOADED diffusion_model.lora_te_encoder_block_8_layer_0_SelfAttention_v.weight
NOT LOADED diffusion_model.lora_te_encoder_block_8_layer_1_DenseReluDense_wi_0.weight
NOT LOADED diffusion_model.lora_te_encoder_block_8_layer_1_DenseReluDense_wi_1.weight
NOT LOADED diffusion_model.lora_te_encoder_block_8_layer_1_DenseReluDense_wo.weight
NOT LOADED diffusion_model.lora_te_encoder_block_9_layer_0_SelfAttention_k.weight
NOT LOADED diffusion_model.lora_te_encoder_block_9_layer_0_SelfAttention_o.weight
NOT LOADED diffusion_model.lora_te_encoder_block_9_layer_0_SelfAttention_q.weight
NOT LOADED diffusion_model.lora_te_encoder_block_9_layer_0_SelfAttention_v.weight
NOT LOADED diffusion_model.lora_te_encoder_block_9_layer_1_DenseReluDense_wi_0.weight
NOT LOADED diffusion_model.lora_te_encoder_block_9_layer_1_DenseReluDense_wi_1.weight
NOT LOADED diffusion_model.lora_te_encoder_block_9_layer_1_DenseReluDense_wo.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_0_attn1_to_k.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_0_attn1_to_out_0.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_0_attn1_to_q.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_0_attn1_to_v.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_0_attn2_to_k.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_0_attn2_to_out_0.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_0_attn2_to_q.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_0_attn2_to_v.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_10_attn1_to_k.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_10_attn1_to_out_0.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_10_attn1_to_q.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_10_attn1_to_v.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_10_attn2_to_k.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_10_attn2_to_out_0.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_10_attn2_to_q.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_10_attn2_to_v.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_11_attn1_to_k.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_11_attn1_to_out_0.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_11_attn1_to_q.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_11_attn1_to_v.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_11_attn2_to_k.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_11_attn2_to_out_0.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_11_attn2_to_q.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_11_attn2_to_v.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_12_attn1_to_k.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_12_attn1_to_out_0.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_12_attn1_to_q.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_12_attn1_to_v.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_12_attn2_to_k.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_12_attn2_to_out_0.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_12_attn2_to_q.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_12_attn2_to_v.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_13_attn1_to_k.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_13_attn1_to_out_0.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_13_attn1_to_q.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_13_attn1_to_v.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_13_attn2_to_k.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_13_attn2_to_out_0.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_13_attn2_to_q.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_13_attn2_to_v.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_14_attn1_to_k.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_14_attn1_to_out_0.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_14_attn1_to_q.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_14_attn1_to_v.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_14_attn2_to_k.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_14_attn2_to_out_0.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_14_attn2_to_q.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_14_attn2_to_v.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_15_attn1_to_k.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_15_attn1_to_out_0.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_15_attn1_to_q.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_15_attn1_to_v.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_15_attn2_to_k.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_15_attn2_to_out_0.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_15_attn2_to_q.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_15_attn2_to_v.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_16_attn1_to_k.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_16_attn1_to_out_0.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_16_attn1_to_q.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_16_attn1_to_v.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_16_attn2_to_k.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_16_attn2_to_out_0.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_16_attn2_to_q.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_16_attn2_to_v.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_17_attn1_to_k.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_17_attn1_to_out_0.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_17_attn1_to_q.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_17_attn1_to_v.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_17_attn2_to_k.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_17_attn2_to_out_0.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_17_attn2_to_q.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_17_attn2_to_v.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_18_attn1_to_k.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_18_attn1_to_out_0.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_18_attn1_to_q.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_18_attn1_to_v.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_18_attn2_to_k.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_18_attn2_to_out_0.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_18_attn2_to_q.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_18_attn2_to_v.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_19_attn1_to_k.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_19_attn1_to_out_0.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_19_attn1_to_q.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_19_attn1_to_v.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_19_attn2_to_k.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_19_attn2_to_out_0.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_19_attn2_to_q.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_19_attn2_to_v.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_1_attn1_to_k.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_1_attn1_to_out_0.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_1_attn1_to_q.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_1_attn1_to_v.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_1_attn2_to_k.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_1_attn2_to_out_0.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_1_attn2_to_q.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_1_attn2_to_v.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_20_attn1_to_k.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_20_attn1_to_out_0.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_20_attn1_to_q.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_20_attn1_to_v.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_20_attn2_to_k.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_20_attn2_to_out_0.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_20_attn2_to_q.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_20_attn2_to_v.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_21_attn1_to_k.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_21_attn1_to_out_0.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_21_attn1_to_q.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_21_attn1_to_v.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_21_attn2_to_k.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_21_attn2_to_out_0.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_21_attn2_to_q.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_21_attn2_to_v.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_22_attn1_to_k.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_22_attn1_to_out_0.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_22_attn1_to_q.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_22_attn1_to_v.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_22_attn2_to_k.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_22_attn2_to_out_0.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_22_attn2_to_q.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_22_attn2_to_v.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_23_attn1_to_k.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_23_attn1_to_out_0.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_23_attn1_to_q.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_23_attn1_to_v.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_23_attn2_to_k.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_23_attn2_to_out_0.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_23_attn2_to_q.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_23_attn2_to_v.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_24_attn1_to_k.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_24_attn1_to_out_0.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_24_attn1_to_q.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_24_attn1_to_v.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_24_attn2_to_k.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_24_attn2_to_out_0.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_24_attn2_to_q.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_24_attn2_to_v.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_25_attn1_to_k.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_25_attn1_to_out_0.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_25_attn1_to_q.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_25_attn1_to_v.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_25_attn2_to_k.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_25_attn2_to_out_0.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_25_attn2_to_q.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_25_attn2_to_v.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_26_attn1_to_k.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_26_attn1_to_out_0.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_26_attn1_to_q.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_26_attn1_to_v.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_26_attn2_to_k.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_26_attn2_to_out_0.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_26_attn2_to_q.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_26_attn2_to_v.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_27_attn1_to_k.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_27_attn1_to_out_0.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_27_attn1_to_q.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_27_attn1_to_v.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_27_attn2_to_k.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_27_attn2_to_out_0.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_27_attn2_to_q.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_27_attn2_to_v.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_2_attn1_to_k.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_2_attn1_to_out_0.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_2_attn1_to_q.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_2_attn1_to_v.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_2_attn2_to_k.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_2_attn2_to_out_0.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_2_attn2_to_q.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_2_attn2_to_v.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_3_attn1_to_k.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_3_attn1_to_out_0.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_3_attn1_to_q.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_3_attn1_to_v.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_3_attn2_to_k.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_3_attn2_to_out_0.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_3_attn2_to_q.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_3_attn2_to_v.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_4_attn1_to_k.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_4_attn1_to_out_0.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_4_attn1_to_q.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_4_attn1_to_v.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_4_attn2_to_k.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_4_attn2_to_out_0.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_4_attn2_to_q.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_4_attn2_to_v.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_5_attn1_to_k.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_5_attn1_to_out_0.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_5_attn1_to_q.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_5_attn1_to_v.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_5_attn2_to_k.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_5_attn2_to_out_0.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_5_attn2_to_q.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_5_attn2_to_v.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_6_attn1_to_k.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_6_attn1_to_out_0.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_6_attn1_to_q.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_6_attn1_to_v.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_6_attn2_to_k.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_6_attn2_to_out_0.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_6_attn2_to_q.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_6_attn2_to_v.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_7_attn1_to_k.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_7_attn1_to_out_0.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_7_attn1_to_q.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_7_attn1_to_v.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_7_attn2_to_k.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_7_attn2_to_out_0.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_7_attn2_to_q.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_7_attn2_to_v.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_8_attn1_to_k.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_8_attn1_to_out_0.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_8_attn1_to_q.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_8_attn1_to_v.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_8_attn2_to_k.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_8_attn2_to_out_0.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_8_attn2_to_q.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_8_attn2_to_v.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_9_attn1_to_k.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_9_attn1_to_out_0.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_9_attn1_to_q.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_9_attn1_to_v.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_9_attn2_to_k.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_9_attn2_to_out_0.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_9_attn2_to_q.weight
NOT LOADED diffusion_model.lora_transformer_transformer_blocks_9_attn2_to_v.weight
```
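For reference, the warnings above look like a key-naming mismatch: OneTrainer exports flattened PEFT-style key names, while the loader expects the model's dotted module paths. A minimal sketch of the kind of conversion involved — the target naming scheme here is an assumption for illustration, not the loader's actual mapping:

```python
import re

def convert_onetrainer_key(key: str) -> str:
    """Map a flattened OneTrainer/PEFT-style LoRA key onto a dotted
    module path. The target naming is illustrative only; the real
    loader's mapping may differ."""
    key = key.removeprefix("diffusion_model.")
    # e.g. lora_transformer_transformer_blocks_3_attn1_to_q.weight
    m = re.match(
        r"lora_transformer_transformer_blocks_(\d+)_(attn[12])_(to_(?:q|k|v|out_0))\.(.+)",
        key,
    )
    if m:
        block, attn, proj, suffix = m.groups()
        return f"transformer_blocks.{block}.{attn}.{proj}.{suffix}"
    return key  # unrecognized keys pass through unchanged
```

For example, `diffusion_model.lora_transformer_transformer_blocks_3_attn1_to_q.weight` becomes `transformer_blocks.3.attn1.to_q.weight`. The hard part in a real loader is that underscores separate both words inside a module name and the boundaries between names, so a hardcoded pattern like this breaks as soon as the layer layout changes.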

city96 added a commit that referenced this issue Jun 11, 2024
@city96
Owner

city96 commented Jun 11, 2024

No clue if that fixes it, but I made it match the correct keys. It still lists a bunch of "missing keys", but it looks like OneTrainer just doesn't train those?

(screenshot)

Also, I can't realistically load the T5 part until I switch over to the SD3 code for that, so I'd turn off training the text encoder until then.

@chrish-slingshot
Author

Yeah I've not been training the text encoder as it's too big.

I'm afraid I get an error now running that example workflow I attached:


```
'EXM_PixArt_ModelPatcher' object has no attribute 'model_keys'

File "S:\StableDiffusion\UI\ComfyUI_2024_06_06\ComfyUI\execution.py", line 151, in recursive_execute
output_data, output_ui = get_output_data(obj, input_data_all)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "S:\StableDiffusion\UI\ComfyUI_2024_06_06\ComfyUI\execution.py", line 81, in get_output_data
return_values = map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "S:\StableDiffusion\UI\ComfyUI_2024_06_06\ComfyUI\execution.py", line 74, in map_node_over_list
results.append(getattr(obj, func)(**slice_dict(input_data_all, i)))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "S:\StableDiffusion\UI\ComfyUI_2024_06_06\ComfyUI\custom_nodes\ComfyUI_ExtraModels\PixArt\nodes.py", line 91, in load_lora
model_lora = load_pixart_lora(model, lora, lora_path, strength,)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "S:\StableDiffusion\UI\ComfyUI_2024_06_06\ComfyUI\custom_nodes\ComfyUI_ExtraModels\PixArt\lora.py", line 137, in load_pixart_lora
k = new_modelpatcher.add_patches(loaded, strength)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "S:\StableDiffusion\UI\ComfyUI_2024_06_06\ComfyUI\comfy\model_patcher.py", line 214, in add_patches
if k in self.model_keys:
^^^^^^^^^^^^^^^
```
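The traceback shows `add_patches` expecting a `model_keys` attribute on the patcher that was never populated. As a hedged sketch (not the repo's actual fix), one defensive workaround is to rebuild the attribute from the wrapped model's state dict whenever a ModelPatcher subclass or a ComfyUI update leaves it unset:

```python
def ensure_model_keys(patcher):
    """Defensive shim: if the patcher (e.g. a ModelPatcher subclass whose
    __init__ diverged from upstream ComfyUI) never populated `model_keys`,
    rebuild it from the wrapped model's state_dict so add_patches can run.
    Illustrative workaround only, not the repo's actual fix."""
    if not hasattr(patcher, "model_keys"):
        patcher.model_keys = set(patcher.model.state_dict().keys())
    return patcher
```

Calling this on the patcher before `add_patches` would sidestep the AttributeError, though the root cause is the subclass and upstream ComfyUI disagreeing about how the patcher is initialized.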

@chrish-slingshot
Author

chrish-slingshot commented Jun 12, 2024

EDIT: Removing the contents of this post as it turns out it was unrelated. Above issue still stands.

@ejektaflex

I have a similar issue, using a LoRa trained with OneTrainer. Here is what I get when I try to use it:
(screenshot)

My LoRA is a bit borked, so I cannot confirm whether it is working, but it's odd that this message shows up. Might it still be working fine?

@frutiemax92
Contributor

I also ran into this issue; here is a LoRA file to test with:
https://www.dropbox.com/scl/fi/30y9yn26ao8pnwch7z1ex/test_lora.zip?rlkey=r6kvgzwvrqm9tnw4jz8ctgu2f&st=zrnmlw0n&dl=0

@boricuapab

I trained a PixArt Sigma 512 MS one which isn't influencing the generation in Comfy, but I tested sampling it in OneTrainer and it works there.

Without lora (OT)

pixartSigmaXL2512MS_loraoff

With (OT)

pixartSigmaXL2512MS_loraon

Without lora (comfy)

comfyLoraOff

With (comfy) (no change in generation)

comfyLoraOnNoEffect

comfy cli

comfyNoEffectCli

@frutiemax92
Contributor

Did you upgrade your extra nodes recently?

@ejektaflex

I have, but I still cannot get a PixArt LoRA to have any meaningful effect like it did in OneTrainer.

@chrisgoringe

Same issue here. Looks like none of the keys in the LoRA are being matched correctly.

Dug a little — among other things, the helper methods get_depth and get_lora_depth are both returning zero.
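To debug this kind of silent mismatch, a small diagnostic that diffs the LoRA's key names against the model's can help. A dependency-free sketch — in practice `lora_keys` would come from `safetensors.torch.load_file(path).keys()` and `model_keys` from `model.state_dict().keys()`, and `convert` stands in for whatever name mapping the loader applies:

```python
def unmatched_lora_keys(lora_keys, model_keys, convert=lambda k: k):
    """Return the LoRA keys whose converted names have no counterpart
    in the model, i.e. the ones a loader would report as NOT LOADED."""
    model_keys = set(model_keys)
    return sorted(k for k in lora_keys if convert(k) not in model_keys)
```

If every key comes back unmatched, the name mapping (or its depth detection, like the get_depth/get_lora_depth helpers mentioned above) is the culprit rather than the training run itself.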

@city96
Owner

city96 commented Jul 27, 2024

The amount of convoluted jank and technical debt in this repo is staggering, and I should really just rewrite the entire LoRA loading logic instead of having half the key conversion be hardcoded.

Anyway, could you try it now? Pushed a fix.

(screenshot)

@chrisgoringe

chrisgoringe commented Jul 27, 2024

Will try it later today. Thanks!

If you want a collaborator to help with refactoring the code, lmk

@boricuapab

boricuapab commented Jul 30, 2024

I've switched over to training diffusers LoRAs for the Sigma line of models using the Sigma LoRA repo. Here's an example of one for Sigma 900M which works inside of Comfy:

https://civitai.com/models/610726/pocket-creatures-sigma-900m

@chrish-slingshot
Author

@boricuapab Nice! Would you mind sharing your training config?

@boricuapab

boricuapab commented Jul 30, 2024

I don't have a OneTrainer training config for it; I'm training them using this repo:

https://github.com/PixArt-alpha/PixArt-sigma

@frutiemax92
Contributor

I've also trained this dreambooth lora with this script:
https://github.com/PixArt-alpha/PixArt-sigma/blob/master/train_scripts/train_dreambooth_lora.py
