
Add FLUX Control LoRA weight param #7452

Merged: 5 commits merged into main from ryan/flux-control-lora-weight on Dec 17, 2024
Conversation

RyanJDick (Collaborator)

Summary

Add the ability to control the weight of a FLUX Control LoRA.
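For context, here is a minimal sketch of how a weight parameter like this commonly scales a LoRA's contribution, assuming a standard low-rank-update formulation; the function and tensor names below are illustrative and are not the actual InvokeAI implementation:

```python
import torch

def apply_lora(
    base_weight: torch.Tensor,
    lora_up: torch.Tensor,
    lora_down: torch.Tensor,
    alpha: float,
    weight: float = 1.0,
) -> torch.Tensor:
    """Return the base layer weight with the LoRA delta scaled by `weight`."""
    rank = lora_down.shape[0]
    delta = (lora_up @ lora_down) * (alpha / rank)  # low-rank update
    return base_weight + weight * delta

# weight=0.0 leaves the base layer untouched; weight=1.0 applies the full LoRA.
base = torch.randn(64, 64)
up, down = torch.randn(64, 8), torch.randn(8, 64)
patched = apply_lora(base, up, down, alpha=8.0, weight=0.6)
```

In this framing, the weights used in the example below (0.4 through 1.0) interpolate between the unpatched transformer and the fully applied control LoRA.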

Example

Original image: [image]

Prompt: a scarecrow playing tennis
Weights: 0.4, 0.6, 0.8, 1.0

Result images at each weight: [4 images]

QA Instructions

  • Verify that the weight param changes the strength of the control image's influence.
  • Test that results match across both quantized and non-quantized transformers.

Merge Plan

Do not merge this PR yet.

  1. Merge Support FLUX structural control LoRA models #7450
  2. Merge LoRA refactor to enable FLUX control LoRAs w/ quantized transformers #7446
  3. Change target branch to main
  4. Merge this branch.

Checklist

  • The PR has a short but descriptive title, suitable for a changelog
  • Tests added / updated (if applicable)
  • Documentation added / updated (if applicable)
  • Updated What's New copy (if doing a release after this PR)

@github-actions bot added labels: python (PRs that change python files), invocations (PRs that change invocations), backend (PRs that change backend files), frontend (PRs that change frontend files) on Dec 17, 2024
RyanJDick (Collaborator, Author)

I think there is some additional metadata handling needed for the weight param, which I have neglected. @maryhipp or @psychedelicious, can you help with that (and any other frontend stuff I may have missed)?

@psychedelicious force-pushed the ryan/flux-control-lora-weight branch from e3b2f31 to 3f57b7a on December 17, 2024 09:01
psychedelicious (Collaborator)

Because the weight setting is part of an individual canvas layer's state, it gets recalled along with the layer. Thankfully, no other changes are required for that to work.

@RyanJDick force-pushed the ryan/lora-refactor-full branch from c407a25 to dd09509 on December 17, 2024 13:21
Base automatically changed from ryan/lora-refactor-full to main on December 17, 2024 13:30
@RyanJDick force-pushed the ryan/flux-control-lora-weight branch from f43f930 to d764aa4 on December 17, 2024 13:36
@RyanJDick enabled auto-merge on December 17, 2024 13:39
@RyanJDick merged commit 594511c into main on Dec 17, 2024 (14 checks passed)
@RyanJDick deleted the ryan/flux-control-lora-weight branch on December 17, 2024 13:46