
Commit

Fix initial_lr when resuming training (huggingface#243)
gritukan committed Nov 28, 2024
1 parent d61d1c8 commit a8f7542
Showing 1 changed file with 1 addition and 1 deletion.
src/nanotron/helpers.py (2 changes: 1 addition & 1 deletion)

@@ -165,7 +165,7 @@ def get_lr_lambda_for_param_group(lr: float):
     # NOTE: get learning rate scheduler for each param group
     lr_lambdas = []
     for param_group in optimizer.get_base_optimizer().param_groups:
-        lr_lambdas.append(get_lr_lambda_for_param_group(lr=param_group["lr"]))
+        lr_lambdas.append(get_lr_lambda_for_param_group(lr=lr_scheduler_args.learning_rate))
 
     assert len(lr_lambdas) == len(
         optimizer.get_base_optimizer().param_groups
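Why this one-line change matters: the lr lambdas are built once per param group from a base learning rate. Before the fix the base was read from param_group["lr"], but when training resumes from a checkpoint the restored optimizer state already holds the decayed learning rate, so the schedule would be applied on top of an already-decayed base. Reading lr_scheduler_args.learning_rate instead anchors the lambdas to the configured initial value. Below is a minimal, self-contained sketch of the failure mode, in generic PyTorch rather than nanotron's actual scheduler code (initial_lr here stands in for lr_scheduler_args.learning_rate; the model, values, and decay function are illustrative):

import torch

model = torch.nn.Linear(4, 4)
initial_lr = 1.0
optimizer = torch.optim.SGD(model.parameters(), lr=initial_lr)

# Toy decay schedule: LambdaLR multiplies the initial lr by decay(step).
def decay(step):
    return 0.5 ** step

scheduler = torch.optim.lr_scheduler.LambdaLR(optimizer, lr_lambda=decay)

for _ in range(3):
    optimizer.step()
    scheduler.step()

# The live value in the param group is now the decayed lr, not the base.
print(optimizer.param_groups[0]["lr"])  # 0.125, not the configured 1.0

# Rebuilding the lambdas from param_group["lr"] after restoring this state
# (the pre-fix behavior) would use 0.125 as the base, so subsequent steps
# would compute 0.125 * decay(step): the decay gets applied twice.
# Using the configured value (initial_lr above) keeps the base correct.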
