Thank you for your contribution, but your pre-trained weights file only has 49 batches. I want to know whether you used a learning-rate decay strategy?
You're welcome. Sorry for my late reply; I have been busy over the past three months.
You probably meant weights.49.pth, which is the model weights at the 49th epoch. We apply the same learning-rate decay strategy as the original author's code: the model is trained with the base learning rate for the first 100 epochs, and the learning rate is halved for the last 100 epochs.
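The two-phase schedule above can be sketched as follows. This is a minimal illustration, not code from the repository; the function name and the base learning rate of 0.001 are assumptions for the example.

```python
def lr_for_epoch(epoch, base_lr=0.001, total_epochs=200):
    """Return the learning rate for a given 0-indexed epoch.

    The base learning rate is used for the first half of training
    (epochs 0-99), then halved for the second half (epochs 100-199),
    matching the schedule described above.
    """
    if epoch < total_epochs // 2:
        return base_lr
    return base_lr / 2
```

In a PyTorch training loop this could equivalently be wired up with `torch.optim.lr_scheduler.LambdaLR`, but the plain function makes the schedule explicit.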
However, the whole training is very long. It took about two weeks to train one setting on my machine, so for the sake of time we trained each model for only 49 epochs. Please run the training yourself to obtain the full 200-epoch model. That said, believe me, the performance is good enough after 49 epochs.