Sweet. For unloading, what I had in mind was making the combination of `model.load_weights(..)` followed by `model.unload_weights(..)` an idempotent operation. That way, I can run LoRA on a model for a while (building a new adapter), then, during loss calculation, run evaluations comparing the original model's outputs against the adapted model's without keeping a redundant copy of the original, and continue until the LoRA run completes.
Currently, adapters can be loaded with:
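The original snippet was lost here; as a rough illustration, here is a minimal pure-Python stand-in for a model exposing a `load_weights`-style API (the class, parameter names, and adapter values are all hypothetical, not from the library):

```python
# Toy stand-in for a model with a load_weights-style API.
# Parameter names and adapter values are illustrative only.

class Model:
    def __init__(self):
        # base weights, keyed by parameter name
        self.weights = {"layer.0.w": 1.0, "layer.1.w": 2.0}

    def load_weights(self, new_weights):
        # merge adapter weights over the base weights in place
        self.weights.update(new_weights)

model = Model()
adapter = {"layer.0.w": 1.5}   # hypothetical LoRA adapter weights
model.load_weights(adapter)
print(model.weights["layer.0.w"])  # 1.5
```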
However, there is no way to either unload the weights:
or to swap in a new one dynamically:
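One way the missing pieces could behave (a sketch of the proposed semantics, not the library's API; `unload_weights` and the snapshot strategy are assumptions): snapshot the base weights on first load, restore them on unload, and make unload a no-op when nothing is loaded, so load-then-unload is idempotent and swapping is just another load.

```python
# Sketch of idempotent load/unload and dynamic adapter swapping.
# unload_weights and the snapshot mechanism are hypothetical.

class Model:
    def __init__(self):
        self.weights = {"layer.0.w": 1.0, "layer.1.w": 2.0}
        self._base = None  # snapshot of pre-adapter weights

    def load_weights(self, new_weights):
        # snapshot base weights once so unload can restore them
        if self._base is None:
            self._base = dict(self.weights)
        self.weights.update(new_weights)

    def unload_weights(self):
        # restore base weights; no-op if no adapter is loaded
        if self._base is not None:
            self.weights = dict(self._base)
            self._base = None

model = Model()
model.load_weights({"layer.0.w": 1.5})   # apply adapter A
model.unload_weights()                   # back to base weights
model.unload_weights()                   # idempotent: safe to call again
model.load_weights({"layer.0.w": 9.0})   # swap in adapter B dynamically
```

With this shape, the DPO-style workflow above becomes: load the in-progress adapter, score, unload, score the base model, repeat.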
This would be useful for several use cases, including DPO loss calculation and dynamically serving LoRA adapters.