
use NlOpt with GPU and Lux #253

Closed
jakubMitura14 opened this issue Jan 15, 2025 · 2 comments

@jakubMitura14
Hello, I tried to use NLopt as shown below:

using Optimization, Lux, Zygote, MLUtils, Statistics, Plots, Random, ComponentArrays
using CUDA, LuxCUDA, OptimizationNLopt

x = Float32.(rand(10, 10))
y = Float32.(sin.(x))
dev = gpu_device()

# data = MLUtils.DataLoader((x, y), batchsize = 1)|> dev
# data = MLUtils.DataLoader((x, y), batchsize = 1)
data = (x, y)

# Define the neural network
model = Chain(Dense(10, 32, tanh), Dense(32, 1))
ps, st = Lux.setup(Random.default_rng(), model)
ps_ca = ComponentArray(ps)
ps_ca = ps_ca |> dev


smodel = StatefulLuxLayer{true}(model, nothing, st)

function callback(state, l)
    state.iter % 25 == 1 && @info "Iteration: $(state.iter), Loss: $l"
    return l < 1e-1 ## Terminate if loss is small
end

function loss(ps, data)
    # move the data onto the GPU before evaluating the model
    d1 = CuArray(data[1])
    d2 = CuArray(data[2])
    ypred = smodel(d1, ps)
    return sum(abs2, ypred .- d2)
end

optf = OptimizationFunction(loss, AutoZygote())
prob = OptimizationProblem(optf, ps_ca, data, lb=ps_ca.*(100000000*(-1)), ub=ps_ca.*100000000)


res = Optimization.solve(prob, NLopt.LD_LBFGS())

but it gives a warning and reports that the arguments are invalid (however, it still returns something):

┌ Warning: NLopt failed to converge: INVALID_ARGS
└ @ OptimizationNLopt /usr/local/share/julia/packages/OptimizationNLopt/YE3fr/src/OptimizationNLopt.jl:299
retcode: Failure
u: 385-element Vector{Float64}:
 -0.001771742245182395
 -0.20413166284561157
  0.47901201248168945
  0.5232356786727905
  0.19946818053722382
  0.9051346778869629
  0.5221430063247681
  ⋮
 -0.01594085991382599
 -0.15956547856330872
  0.22767578065395355
  0.18943265080451965
  0.2767360210418701
 -0.09060057252645493
 -0.12697920203208923
@odow
Member

odow commented Jan 16, 2025

Hi @jakubMitura14, I'm not sure what the cause is. I suggest you instead open an issue in https://github.com/SciML/Optimization.jl. This doesn't seem like a bug in NLopt.jl.

It would help to create a minimal reproducible example. Does this need the GPU? Does it need the lower and upper variable bounds? Does it need packages like Plots?
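For example, a stripped-down version might look like the sketch below (an illustrative reduction only, not a confirmed workaround: same model and loss, but CPU-only, no bounds, and only the packages that are actually used):

using Optimization, OptimizationNLopt, Lux, Zygote, Random, ComponentArrays

x = Float32.(rand(10, 10))
y = sin.(x)

model = Chain(Dense(10, 32, tanh), Dense(32, 1))
ps, st = Lux.setup(Random.default_rng(), model)
ps_ca = ComponentArray(ps)

smodel = StatefulLuxLayer{true}(model, nothing, st)

# Plain sum-of-squares loss over the full data tuple.
loss(ps, data) = sum(abs2, smodel(data[1], ps) .- data[2])

optf = OptimizationFunction(loss, AutoZygote())
prob = OptimizationProblem(optf, ps_ca, (x, y))   # no lb/ub

res = Optimization.solve(prob, NLopt.LD_LBFGS())

If that already reproduces the failure, the GPU and bounds can be ruled out; if not, they can be added back one at a time.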

odow closed this as completed Jan 16, 2025
@jakubMitura14
Author

  1. To be honest, I was already trying to create it as a minimal working example.
  2. Yes, it requires the GPU.
  3. No, it does not need the upper and lower bounds.
  4. Indeed, it does not need them; I had added too many packages - thanks.

So what do you suggest changing here, apart from removing the unnecessary package imports?
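A trimmed-down GPU reproduction, keeping only what the answers above say is needed, might look like the following sketch (bounds and unused packages removed, data moved to the device once up front instead of inside the loss; whether this avoids the INVALID_ARGS warning is untested):

using Optimization, OptimizationNLopt, Lux, Zygote, Random, ComponentArrays
using CUDA, LuxCUDA

dev = gpu_device()

x = Float32.(rand(10, 10))
y = sin.(x)
data = (x, y) |> dev   # move the data to the GPU once, up front

model = Chain(Dense(10, 32, tanh), Dense(32, 1))
ps, st = Lux.setup(Random.default_rng(), model)
ps_ca = ComponentArray(ps) |> dev

smodel = StatefulLuxLayer{true}(model, nothing, st)

# With the data already on the device, no conversion is needed inside the loss.
loss(ps, data) = sum(abs2, smodel(data[1], ps) .- data[2])

optf = OptimizationFunction(loss, AutoZygote())
prob = OptimizationProblem(optf, ps_ca, data)   # no lb/ub

res = Optimization.solve(prob, NLopt.LD_LBFGS())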
