configs/training.json doesn't work #62
Comments
> do you change lr_init? lr_init is the initial learning rate, while CosineAnnealingLR is a learning rate scheduler

Yes, I changed it. The file LaTeX_OCR_PRO/configs/training.json seems to be useless.
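For illustration, here is a minimal sketch of how lr_init and CosineAnnealingLR relate in a plain PyTorch training loop. The model, optimizer, and numbers below are placeholders, not code from this repository; the point is only that the lr passed to the optimizer is the starting value, and the scheduler anneals it from there.

```python
import torch

# Placeholder model and optimizer, only to illustrate the relationship.
model = torch.nn.Linear(10, 2)
lr_init = 1e-3                      # the initial learning rate
optimizer = torch.optim.Adam(model.parameters(), lr=lr_init)

# CosineAnnealingLR is a scheduler: it starts from lr_init and
# anneals the learning rate toward eta_min over T_max steps.
scheduler = torch.optim.lr_scheduler.CosineAnnealingLR(
    optimizer, T_max=50, eta_min=1e-6)

for epoch in range(50):
    # ... run one epoch of training here ...
    optimizer.step()
    scheduler.step()
```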
> please refer to model/utils/lr_schedule.py, which defines the object LRSchedule. Warm-up (lr_warm, end_warm) and decay (start_decay, end_decay) are used to schedule the learning rate; the learning rate equals lr_init only when end_warm < epoch < start_decay.

That feels quite complicated to me. How can I use PyTorch's lr_scheduler.MultiplicativeLR for training instead? Thanks!
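As a reading aid, here is a rough sketch of the warm-up / plateau / decay behaviour described above. It is not the actual code from model/utils/lr_schedule.py; the constant warm-up value and the linear decay toward a hypothetical lr_min are assumptions made only to show the three phases.

```python
def lr_at_epoch(epoch, lr_init, lr_warm, lr_min,
                end_warm, start_decay, end_decay):
    """Illustrative warm-up/plateau/decay schedule (assumed curve shapes)."""
    if epoch <= end_warm:
        # warm-up phase: use the warm-up learning rate
        return lr_warm
    if epoch < start_decay:
        # plateau: the learning rate equals lr_init only in this range
        return lr_init
    if epoch < end_decay:
        # decay phase: interpolate linearly from lr_init down to lr_min
        frac = (epoch - start_decay) / (end_decay - start_decay)
        return lr_init + frac * (lr_min - lr_init)
    return lr_min
```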
You can refer to https://pytorch.org/docs/stable/generated/torch.optim.lr_scheduler.MultiplicativeLR.html for PyTorch's MultiplicativeLR.
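For completeness, a minimal usage sketch of MultiplicativeLR based on the linked documentation; the model, optimizer, and the 0.95 decay factor are placeholders, not values from this project.

```python
import torch

model = torch.nn.Linear(10, 2)                       # placeholder model
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

# MultiplicativeLR multiplies the current learning rate by the factor
# returned by lr_lambda each time scheduler.step() is called.
scheduler = torch.optim.lr_scheduler.MultiplicativeLR(
    optimizer, lr_lambda=lambda epoch: 0.95)

for epoch in range(10):
    # ... run one epoch of training here ...
    optimizer.step()
    scheduler.step()                                  # lr <- lr * 0.95
```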
emmm...
configs/training.json is useless: changing the learning rate in it does not work at all. The learning rate I have been using is CosineAnnealingLR.