The performance of the generated models depends on your training data, i.e., the models trained with SGD.
Could you provide more details, such as the original model's performance? An SGD-trained model's performance can be affected by many factors, e.g., the random seed and GPU type, so 76.09% may be a reasonable result.
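In case it helps reproducibility, here is a minimal sketch of how the seeds could be fixed before SGD training, assuming a standard PyTorch setup (the helper name `set_seed` is hypothetical and not taken from this repo; exact reproducibility across GPU types is still not guaranteed):

```python
import random

import numpy as np
import torch


def set_seed(seed: int = 42) -> None:
    # Fix all common sources of randomness to reduce run-to-run variance.
    random.seed(seed)
    np.random.seed(seed)
    torch.manual_seed(seed)
    torch.cuda.manual_seed_all(seed)
    # Deterministic cuDNN kernels trade some speed for repeatability.
    torch.backends.cudnn.deterministic = True
    torch.backends.cudnn.benchmark = False
```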
The train_p_diff.log contains terminal output about experiment initialization and torch.distributed.nn. This does not affect the experiment itself.
If you still have any questions or concerns, feel free to contact us.
You mentioned, "The train_p_diff.log contains terminal output about experiment initialization and torch.distributed.nn. This does not affect the conduct of the experiment."
However, after running your code, I found that the log file is empty. Everything is shown in the terminal but is not saved to the local log file.
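For reference, here is a minimal sketch of how the logger could be configured to write to both the terminal and the file, assuming the project uses Python's standard logging module (this is a hypothetical setup, not the repo's actual code):

```python
import logging

logger = logging.getLogger("train_p_diff")
logger.setLevel(logging.INFO)

stream_handler = logging.StreamHandler()  # prints to the terminal
file_handler = logging.FileHandler("./outputs/cifar100/train_p_diff.log")  # writes to the log file

formatter = logging.Formatter("%(asctime)s %(levelname)s %(message)s")
for handler in (stream_handler, file_handler):
    handler.setFormatter(formatter)
    logger.addHandler(handler)

logger.info("This message appears in the terminal and in the log file.")
```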
Hi authors:
After running cifar100_resnet18_k200.sh, I found that the result is 76.09%, but the number reported in the paper is higher.
Furthermore, I found that the log file "./outputs/cifar100/train_p_diff.log" is empty.
Could you please help with this issue?
Best