-
You can use any custom optimizer (https://detectron2.readthedocs.io/en/latest/tutorials/training.html), so you can set param groups to whatever is needed. See `detectron2/projects/Panoptic-DeepLab/train_net.py`, lines 111 to 112 at commit c152862, for an example.
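In case a concrete illustration helps, here is a minimal sketch of that approach (this is not the Panoptic-DeepLab code itself; the `"backbone"` name check and the `/10` ratio are made-up grouping rules for illustration):

```python
# Sketch: override DefaultTrainer.build_optimizer to supply custom
# parameter groups, each with its own learning rate.
import torch
from detectron2.engine import DefaultTrainer


class Trainer(DefaultTrainer):
    @classmethod
    def build_optimizer(cls, cfg, model):
        base_lr = cfg.SOLVER.BASE_LR
        backbone_params, head_params = [], []
        for name, param in model.named_parameters():
            if not param.requires_grad:
                continue
            # Hypothetical grouping rule: train backbone layers slower.
            if "backbone" in name:
                backbone_params.append(param)
            else:
                head_params.append(param)
        param_groups = [
            {"params": backbone_params, "lr": base_lr / 10},
            {"params": head_params, "lr": base_lr},
        ]
        return torch.optim.SGD(
            param_groups,
            lr=base_lr,  # default for groups that don't set "lr"
            momentum=cfg.SOLVER.MOMENTUM,
            weight_decay=cfg.SOLVER.WEIGHT_DECAY,
        )
```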
-
Sorry to be dumb, but where would you define the parameter groups, and where would you set the corresponding learning rates?
-
I'm just wondering if the ability to set different learning rates for parameter groups within the model is implemented in Detectron2. For example, could you train three parameter groups where the bottom layers of the model train slower than the top, with `lrs = np.array([lr/9, lr/3, lr])`? If I'm reading the code right, it looks like `build_optimizer()` and `build_lr_scheduler()` are both set up to use a single learning rate for all of the layers in a model, but then `WarmupMultiStepLR` and `WarmupCosineLR` both have a variable that holds a list of learning rates (`self.base_lrs`). I may have missed it, but I don't see where you would set the parameter groups or the corresponding learning rates.
Thanks.
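For what it's worth, here is a plain-PyTorch sketch of what I mean (a hypothetical toy model, not Detectron2's `build_optimizer`), showing that per-group learning rates set on the optimizer are exactly what end up in a scheduler's `self.base_lrs`:

```python
import torch
import torch.nn as nn

# Toy three-block model standing in for bottom/middle/top layers.
model = nn.Sequential(nn.Linear(8, 8), nn.Linear(8, 8), nn.Linear(8, 2))

lr = 0.09
optimizer = torch.optim.SGD(
    [
        {"params": model[0].parameters(), "lr": lr / 9},  # bottom layers
        {"params": model[1].parameters(), "lr": lr / 3},  # middle layers
        {"params": model[2].parameters(), "lr": lr},      # top layers
    ],
    lr=lr,  # default for any group that doesn't set its own "lr"
)

scheduler = torch.optim.lr_scheduler.MultiStepLR(optimizer, milestones=[10, 20])
print(scheduler.base_lrs)  # [0.01, 0.03, 0.09] -> one base lr per group
```

Since the scheduler scales each entry of `base_lrs` independently at every step, the groups keep their relative ratios throughout training.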