Fix "amsgrad" is used before being defined when initializing the Adam…
Browse files Browse the repository at this point in the history
…W optimizer (#660)

* Fix delayvar not correct in concat mode

* Update AdamW in optimizer.py

Fixed "amsgrad" is used before being defined.
CloudyDory authored Mar 29, 2024
1 parent 87858c5 commit b06d80a
Showing 1 changed file with 1 addition and 1 deletion.
2 changes: 1 addition & 1 deletion brainpy/_src/optimizers/optimizer.py
@@ -901,6 +901,7 @@ def __init__(
     amsgrad: bool = False,
     name: Optional[str] = None,
   ):
+    self.amsgrad = amsgrad
     super(AdamW, self).__init__(lr=lr,
                                 train_vars=train_vars,
                                 weight_decay=weight_decay,
@@ -919,7 +920,6 @@ def __init__(
     self.beta2 = beta2
     self.eps = eps
     self.weight_decay = weight_decay
-    self.amsgrad = amsgrad
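The fix moves the `self.amsgrad` assignment above the `super(AdamW, self).__init__(...)` call. This matters when the parent constructor invokes a hook that reads `self.amsgrad` before the subclass has assigned it. A minimal sketch of the pattern (with hypothetical class and method names, not BrainPy's actual internals):

```python
class Optimizer:
    def __init__(self, lr):
        self.lr = lr
        self.build_states()  # hook runs during parent construction

    def build_states(self):
        pass


class AdamWBroken(Optimizer):
    def __init__(self, lr, amsgrad=False):
        super().__init__(lr)    # build_states() runs here...
        self.amsgrad = amsgrad  # ...but amsgrad is assigned only afterwards

    def build_states(self):
        # Reads self.amsgrad before __init__ has set it -> AttributeError
        self.max_exp_avg = [] if self.amsgrad else None


class AdamWFixed(Optimizer):
    def __init__(self, lr, amsgrad=False):
        self.amsgrad = amsgrad  # assign before the parent hook can read it
        super().__init__(lr)

    def build_states(self):
        self.max_exp_avg = [] if self.amsgrad else None


try:
    AdamWBroken(lr=0.001)
except AttributeError as e:
    print("broken:", e)

opt = AdamWFixed(lr=0.001, amsgrad=True)
print("fixed:", opt.max_exp_avg)  # → fixed: []
```

Setting the attribute before calling `super().__init__` is a common remedy when overridden methods run inside the base constructor; the alternative is to pass the value through the parent's signature so the base class owns it.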

def __repr__(self):
return (f"{self.__class__.__name__}(lr={self.lr}, "
