
Memory leaking when computing gradients #249

Closed · jnsbck opened this issue Oct 5, 2023 · 4 comments
Labels: bug (Something isn't working)

Comments

@jnsbck

jnsbck commented Oct 5, 2023

Hey,

I encountered a weird issue where memory seemingly leaks during `ForwardDiff.gradient(ek1_loss)`. I was able to boil it down to the snippet below:

using ComponentArrays
using ForwardDiff
using ProbNumDiffEq
using BenchmarkTools

function get_lv_prob(tspan=(0.0, 10.0))
    function lotka_volterra(du, u, p, t)
        du[1] = p[1] * u[1] - p[2] * u[1] * u[2]
        du[2] = -p[3] * u[2] + p[4] * u[1] * u[2]
    end
    p = [1.5, 1.0, 3.0, 1.0]
    u0 = [1.0, 1.0]
    prob = ODEProblem(lotka_volterra, u0, tspan, p)
    return prob
end

prob = get_lv_prob((0.0, 17.0)) # no leak
prob = get_lv_prob((0.0, 20.0)) # slowly starts leaking
prob = get_lv_prob((0.0, 100.0)) # continues to leak

function ek1_loss(p, prob)
    # leaks for both EK0 and EK1
    sol = solve(prob, EK1(smooth=false), dense=false, p=p) # leak independent of order or dt
    l = sum(abs2, Array(sol)) / length(sol)
    return l
end

function test(; prob=prob)
    p = ComponentArray(prob.p)
    total_mem = Sys.total_memory() / 2^20
    i = 0
    while true
        i += 1
        ForwardDiff.gradient(p -> ek1_loss(p, prob), p)
        # GC.gc() # avoid memory leak (HOTFIX)
        free_mem = Sys.free_memory() / 2^20
        mem_usage = round((total_mem - free_mem) / total_mem * 100)
        print("\r Iteration $i, memory usage: $mem_usage%")
    end
end

test()
  • I am on the latest ProbNumDiffEq v0.12.1.
  • Not happening with solvers in OrdinaryDiffEq.
  • Not happening with small tspans, for some reason.
  • Calling the garbage collector "fixes" the issue.
  • Leaking happens independent of solver order, step size, or whether EK0 or EK1 is used.

Lemme know in case you need any additional info. Thanks for looking into this.
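Based on the garbage-collector observation above, a minimal stopgap sketch could force a collection every few gradient calls to keep memory bounded (this reuses `ek1_loss` and `prob` from the snippet; `gc_every` and `test_with_gc` are illustrative names, not part of any package API):

```julia
# Hypothetical workaround: trigger a full collection periodically so the
# leaked allocations are reclaimed between gradient evaluations.
function test_with_gc(prob; gc_every=10, iters=100)
    p = ComponentArray(prob.p)
    for i in 1:iters
        ForwardDiff.gradient(p -> ek1_loss(p, prob), p)
        # GC.gc() runs a full collection; GC.gc(false) would do an
        # incremental one, which may be cheaper if called often.
        i % gc_every == 0 && GC.gc()
    end
end
```

Tuning `gc_every` trades collection overhead against peak memory; this only masks the leak rather than fixing it.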

@jnsbck jnsbck changed the title Memory leaking when computing gradients through EK1 solve Memory leaking when computing gradients Oct 6, 2023
@nathanaelbosch
Owner

Thanks for opening this issue! I don't really have an idea where this is coming from, as I am not familiar enough with Julia's internals and garbage collection. But I am surprised that manually calling GC.gc() fixes this. Either way, this should not be happening, and I will try to find out more to get this fixed.

@nathanaelbosch nathanaelbosch added the bug Something isn't working label Nov 15, 2023
@nathanaelbosch
Owner

I just ran this code again and the memory usage does not seem to increase, so I will close this issue for now. But if I somehow missed something and this still persists, please do reopen the issue.

@jnsbck
Author

jnsbck commented Feb 12, 2024

Just tested again, and for me the issue still persists. I am on the latest package versions:

  [6e4b80f9] BenchmarkTools v1.4.0
  [b0b7db55] ComponentArrays v0.15.8
  [f6369f11] ForwardDiff v0.10.36
  [bf3e78b0] ProbNumDiffEq v0.15.0

EDIT: cannot re-open since you closed it and I am not a collaborator on this repo.

@nathanaelbosch
Owner

nathanaelbosch commented Feb 13, 2024

OK, so this seems to be an issue with Julia 1.9; with 1.10 I can't reproduce it.
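Given the 1.9/1.10 split above, a quick sketch for checking whether the running Julia version is affected (using the built-in `VERSION` constant, which is a `VersionNumber`):

```julia
# Warn on Julia versions older than 1.10, where the leak was observed.
if VERSION < v"1.10"
    @warn "Memory may grow during ForwardDiff.gradient on Julia $(VERSION); " *
          "consider calling GC.gc() periodically as a workaround."
end
```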
