PEFT modules resolution: corrected warning
stefanik12 committed Apr 18, 2024
1 parent 53979f2 commit c7f35e2
Showing 1 changed file with 1 addition and 2 deletions.
adaptor/lang_module.py: 1 addition, 2 deletions
@@ -87,8 +87,7 @@ def load_head(self,
         else:
             PeftModelCls = HEAD_TO_MODEL_CLS[head_type]["peft"]
             # if that fails, trying to load as a PEFT model
-            logger.warning("Loading model_name_or_path='%s' as full transformer failed. "
-                           "Attempting to load it as peft model.", model_name_or_path)
+            logger.warning("Loading model_name_or_path='%s' as peft model.", model_name_or_path)
         # base model resolution
         # we avoid reloading the base model separately for each lora module
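The fallback pattern the diff touches can be sketched as a minimal example. This is an assumption-laden illustration, not the library's actual code: `load_full_transformer` and `load_peft_model` are hypothetical stand-ins for the real transformer/PEFT loaders, and only the try/except flow with the corrected warning mirrors the commit.

```python
import logging

logger = logging.getLogger("adaptor.lang_module")


def load_full_transformer(model_name_or_path):
    # Hypothetical stand-in: pretend this checkpoint is not a full transformer.
    raise OSError("not a full transformer checkpoint")


def load_peft_model(model_name_or_path):
    # Hypothetical stand-in for resolving the checkpoint as a PEFT module.
    return {"peft_model": model_name_or_path}


def load_head(model_name_or_path):
    """Try loading a full transformer first; fall back to a PEFT model."""
    try:
        return load_full_transformer(model_name_or_path)
    except OSError:
        # The corrected message simply states the PEFT attempt, rather than
        # restating that the full-transformer load failed.
        logger.warning("Loading model_name_or_path='%s' as peft model.",
                       model_name_or_path)
        return load_peft_model(model_name_or_path)
```

The point of the one-line warning is that by the time this branch runs, the full-transformer failure is already implied; logging it again was redundant.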
