Optimizer with MultiTransform throws ValueError #481
varunagrawal started this conversation in General
Replies: 2 comments
-
NOTE: I got the idea for using the …
-
Hi @varunagrawal, I'm not familiar with …
-
I want to freeze parts of my network for training, so I modified the `ppo.train` function to accept an optimizer object. I then define the optimizer as

However, when the optimizer calls `init` as `optimizer.init(init_params)` while defining the `TrainingState`, it throws the following error:

This does not happen when I define the optimizer as `op = optax.adam(3.0e-4)`. Since both variants are `optax.GradientTransformationExtraArgs`, can someone please explain what is happening and how I can resolve it?

Here's the full stack trace:
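For reference, the sketch below shows a typical way to freeze parameters with `optax.multi_transform`; the names (`policy`, `value`, the label strings) are hypothetical, since the original optimizer definition and stack trace were not preserved on this page. One common source of a `ValueError` from `multi_transform`'s `init` is a label pytree whose structure does not match the params pytree that `init` actually receives, a check that a plain `optax.adam` never performs.

```python
# A minimal sketch of freezing part of a network with optax.multi_transform.
# All parameter and label names here are hypothetical stand-ins.
import jax.numpy as jnp
import optax

# Toy params pytree standing in for the network parameters.
params = {
    'policy': {'w': jnp.ones((2, 2))},
    'value': {'w': jnp.ones((2, 2))},
}

# Labels must mirror the structure of the params pytree that .init()
# will eventually receive (or be a callable producing such a pytree).
param_labels = {
    'policy': {'w': 'train'},
    'value': {'w': 'freeze'},
}

op = optax.multi_transform(
    {'train': optax.adam(3.0e-4), 'freeze': optax.set_to_zero()},
    param_labels,
)

# Succeeds because `params` matches the structure the labels describe.
opt_state = op.init(params)

# If the training loop (e.g. ppo.train) calls init with a differently
# structured pytree -- say the raw params wrapped in an extra container --
# the tree-structure mismatch surfaces as a ValueError, whereas plain
# optax.adam(3.0e-4) accepts any params pytree and never hits that check.
```

One thing worth checking under this assumption is what pytree `ppo.train` actually passes to `optimizer.init`: if it wraps or restructures the network params before calling `init`, the labels would need to be built against that wrapped structure (or supplied as a label function) rather than against the raw network params.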