Possible bad switch in optimizer #1

Open

loliverhennigh opened this issue Jul 10, 2019 · 1 comment

Comments

@loliverhennigh

Hey, I think there is an issue in the code at the switch statement. It checks whether t is greater than 1, but t is 1 because of t = self.iterations + 1 above. For me this was returning d_t on the first step and causing the loss to diverge instantly. I set the value to 2 and everything works now. I am using this optimizer with TensorFlow rather than Keras, so maybe that is messing something up too...

https://github.com/rooa/eve/blob/master/eve/optim/eve.py#L67
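
A minimal sketch of the switch behavior being described (not the actual eve.py code; `d_new` is a placeholder for the computed feedback term, and the names are illustrative, but it uses Keras backend ops as the repo does):

```python
import tensorflow as tf
from tensorflow.keras import backend as K

iterations = K.variable(0, dtype="int64")   # optimizer step counter
t = iterations + 1                          # t == 1 on the first update

d_new = K.variable(5.0)   # hypothetical feedback term computed from the loss
one = K.ones_like(d_new)  # fallback value for the first step

# The switch in question: use the computed term only after the first step.
# If this condition evaluates to True on step 1, d_new is applied immediately
# and the effective learning rate can blow up.
d_t = K.switch(K.greater(t, 1), d_new, one)
print(K.get_value(d_t))   # expected: 1.0 on the first step
```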

@jayanthkoushik
Collaborator

Hi, self.iterations is initialized to 0 (https://github.com/rooa/eve/blob/master/eve/optim/eve.py#L27). So t should be 1 on the first step. Do you mind sharing how you are using the code?
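
A quick check of the arithmetic here, assuming the counter really is initialized to 0 as on the linked line (again just a sketch, not the repo code):

```python
from tensorflow.keras import backend as K

iterations = K.variable(0, dtype="int64")        # initialized to 0
t = iterations + 1                               # first update sees t == 1
print(bool(K.get_value(K.greater(t, 1))))        # False -> the d_t branch should not fire
```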
