Memory leaking when computing gradients #249
Thanks for opening this issue! I don't really have an idea where this is coming from; I am not familiar enough with the Julia internals and garbage collection. But I am surprised that manually calling …
I just ran this code again and the memory usage does not seem to increase, so I will close this issue for now. But if I somehow missed something and this still persists, please do reopen the issue.
Just tested again, and for me the issue still persists. I am on the latest package versions.
EDIT: I cannot re-open it since you closed it and I am not a collaborator on this repo.
Ok, so this seems to be an issue with Julia 1.9; with 1.10 I can't reproduce it.
Hey,
I encountered a weird issue where memory seemingly leaks during `ForwardDiff.gradient(ek1_loss)`. I was able to boil it down to the snippet below, on ProbNumDiffEq v0.12.1.
Let me know in case you need any additional info. Thanks for looking into this.
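The original snippet is not preserved in this thread, but a minimal check for this kind of leak might look like the sketch below. The `ek1_loss` body here is a placeholder (the real one presumably solves an ODE with ProbNumDiffEq's `EK1` solver); only `ForwardDiff.gradient`, `GC.gc`, and `Base.gc_live_bytes` are real APIs from ForwardDiff.jl and Julia Base.

```julia
using ForwardDiff

# Placeholder loss; the actual ek1_loss in the report involves ProbNumDiffEq.
ek1_loss(p) = sum(abs2, p)

p0 = rand(3)
for i in 1:100
    ForwardDiff.gradient(ek1_loss, p0)
    GC.gc()  # manual collection; per the report, memory still grew on Julia 1.9
    println(Base.gc_live_bytes())  # watch whether live heap bytes keep rising
end
```

If the printed byte counts grow monotonically across iterations even with manual `GC.gc()` calls, that matches the leak behavior described in the report.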