The results on ImageNet-C are sensitive to some hyperparameters #4
Comments
Thanks for your interest in tent ⛺ !
The code for ImageNet-C will be included shortly, once we have finished simplifying it and re-running it. I'll follow up here and close this issue when it's pushed.
Please see page 5 of the paper at ICLR'21 for details of the hyperparameters.
Could you tell us which specific hyperparameter(s) you would like to know about? For the optimization settings, we have seen improvements with a variety of learning rates in [0.00025, 0.01] with SGD+momentum or Adam. However, the amount of improvement can vary, and there are settings that hurt. We recommend selecting hyperparameters on the held-out "extra" corruptions (speckle, spatter, gaussian_blur, saturate). For the model, we have used the pre-trained ResNet-50 from pycls as our baseline, as well as ResNet-50 models that we have trained ourselves.
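For reference, here is a minimal sketch of the setup described above: entropy minimization over the batch-norm affine parameters of a ResNet-50, with SGD+momentum at a learning rate inside the suggested range. The ImageNet-C script is not yet released, so this is only an illustration of the technique, not the authors' code; the torchvision ResNet-50 is used as a stand-in for the pycls baseline, and `adapt_step` is a hypothetical helper name.

```python
import torch
import torch.nn as nn
from torchvision.models import resnet50

# Stand-in baseline; the paper's ImageNet-C results use the pycls ResNet-50.
model = resnet50(pretrained=True)

# Tent adapts only the batch-norm affine parameters and uses test-batch statistics.
model.train()  # BatchNorm computes statistics from the current batch
bn_params = []
for m in model.modules():
    if isinstance(m, nn.BatchNorm2d):
        m.track_running_stats = False
        m.running_mean, m.running_var = None, None
        if m.affine:
            bn_params += [m.weight, m.bias]

# Freeze everything except the collected batch-norm scales and shifts.
for p in model.parameters():
    p.requires_grad_(False)
for p in bn_params:
    p.requires_grad_(True)

# Learning rate within the [0.00025, 0.01] range mentioned above; Adam also works.
optimizer = torch.optim.SGD(bn_params, lr=0.00025, momentum=0.9)

def adapt_step(x):
    """One online adaptation step: minimize the entropy of the predictions."""
    logits = model(x)
    probs = logits.softmax(dim=1)
    entropy = -(probs * logits.log_softmax(dim=1)).sum(dim=1).mean()
    optimizer.zero_grad()
    entropy.backward()
    optimizer.step()
    return logits.detach()

# Usage: for each test batch x from a corrupted split, predictions = adapt_step(x)
```

Hyperparameter selection would then follow the recommendation above: tune the learning rate (and optimizer) on the held-out "extra" corruptions before evaluating on the standard ImageNet-C corruptions.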
Hi! Could you please also share the code used for the segmentation experiments? Thanks!
Hi! I'm a new student working on test-time adaptation and I'm very fond of your work. Are you ready to share the code for ImageNet-C yet? Thanks!
Hi, thanks for sharing your great work. The current repository only contains example code illustrating how tent works. I am wondering whether you will share the code to exactly reproduce the results on ImageNet-C, or some implementation details if the code is not available, because tent seems to be very sensitive to the choice of some hyperparameters.