Unit tests fail on QuadraticModels with ADNLPModels 0.8 #255

Closed
tmigot opened this issue Jun 24, 2024 · 9 comments
Labels
bug Something isn't working

Comments

tmigot (Member) commented Jun 24, 2024

```julia
using LinearAlgebra, ADNLPModels, NLPModels, NLPModelsTest
nlp = BROWNDEN()
x = nlp.meta.x0
fx, gx, Hx = obj(nlp, x), grad(nlp, x), hess(nlp, x)
ADNLPModel(
  s -> fx + dot(gx, s) + dot(s, Hx * s) / 2,
  zeros(nlp.meta.nvar),
  nlp.meta.lvar - x,
  nlp.meta.uvar - x,
)
```

and the error is:

```
ERROR: MethodError: no method matching !(::SparseConnectivityTracer.HessianTracer{BitSet, Set{Tuple{Int64, Int64}}})

Closest candidates are:
  !(::Missing)
   @ Base missing.jl:101
  !(::Bool)
   @ Base bool.jl:35
  !(::ComposedFunction{typeof(!)})
   @ Base operators.jl:1099
  ...

Stacktrace:
  [1] !=(x::SparseConnectivityTracer.HessianTracer{BitSet, Set{…}}, y::SparseConnectivityTracer.HessianTracer{BitSet, Set{…}})
    @ Base .\operators.jl:276
  [2] _mul!(nzrang::typeof(SparseArrays.nzrangelo), diagop::typeof(identity), odiagop::typeof(transpose), C::Vector{…}, A::SparseArrays.SparseMatrixCSC{…}, B::Vector{…}, α::SparseConnectivityTracer.HessianTracer{…}, β::SparseConnectivityTracer.HessianTracer{…})
    @ SparseArrays \AppData\Local\Programs\julia-1.10.2\share\julia\stdlib\v1.10\SparseArrays\src\linalg.jl:881
  [3] spdensemul!(C::Vector{…}, tA::Char, tB::Char, A::SparseArrays.SparseMatrixCSC{…}, B::Vector{…}, _add::LinearAlgebra.MulAddMul{…})
    @ SparseArrays \AppData\Local\Programs\julia-1.10.2\share\julia\stdlib\v1.10\SparseArrays\src\linalg.jl:50
  [4] generic_matvecmul!
    @ \AppData\Local\Programs\julia-1.10.2\share\julia\stdlib\v1.10\SparseArrays\src\linalg.jl:35 [inlined]
  [5] mul!
    @ \AppData\Local\Programs\julia-1.10.2\share\julia\stdlib\v1.10\LinearAlgebra\src\matmul.jl:66 [inlined]
  [6] mul!
    @ \AppData\Local\Programs\julia-1.10.2\share\julia\stdlib\v1.10\LinearAlgebra\src\matmul.jl:237 [inlined]
  [7] *(A::Symmetric{Float64, SparseArrays.SparseMatrixCSC{…}}, x::Vector{SparseConnectivityTracer.HessianTracer{…}})
    @ LinearAlgebra \AppData\Local\Programs\julia-1.10.2\share\julia\stdlib\v1.10\LinearAlgebra\src\matmul.jl:57
  [8] (::var"#5#6")(s::Vector{SparseConnectivityTracer.HessianTracer{BitSet, Set{Tuple{Int64, Int64}}}})   
    @ Main .\REPL[19]:2
  [9] (::ADNLPModels.var"#lagrangian#55"{…})(x::Vector{…})
    @ ADNLPModels \.julia\packages\ADNLPModels\ZWqLN\src\sparsity_pattern.jl:42
 [10] trace_function(::Type{…}, f::ADNLPModels.var"#lagrangian#55"{…}, x::Vector{…})
    @ SparseConnectivityTracer \.julia\packages\SparseConnectivityTracer\QlV0S\src\pattern.jl:32
 [11] hessian_pattern(f::Function, x::Vector{Float64}, ::Type{BitSet}, ::Type{Set{Tuple{Int64, Int64}}})   
    @ SparseConnectivityTracer \.julia\packages\SparseConnectivityTracer\QlV0S\src\pattern.jl:326
 [12] hessian_sparsity
    @ \.julia\packages\SparseConnectivityTracer\QlV0S\src\adtypes.jl:50 [inlined]
 [13] compute_hessian_sparsity(f::var"#5#6", nvar::Int64, c!::ADNLPModels.var"#2#3", ncon::Int64; detector::SparseConnectivityTracer.TracerSparsityDetector{…})
    @ ADNLPModels \.julia\packages\ADNLPModels\ZWqLN\src\sparsity_pattern.jl:51
 [14] ADNLPModels.SparseADHessian(nvar::Int64, f::Function, ncon::Int64, c!::ADNLPModels.var"#2#3"; x0::Vector{…}, coloring::SparseMatrixColorings.GreedyColoringAlgorithm{…}, detector::SparseConnectivityTracer.TracerSparsityDetector{…}, kwargs::@Kwargs{})
    @ ADNLPModels \.julia\packages\ADNLPModels\ZWqLN\src\sparse_hessian.jl:28
 [15] macro expansion
    @ \.julia\packages\ADNLPModels\ZWqLN\src\ad.jl:90 [inlined]
 [16] macro expansion
    @ .\timing.jl:395 [inlined]
 [17] ADNLPModels.ADModelBackend(nvar::Int64, f::var"#5#6"; backend::Symbol, matrix_free::Bool, show_time::Bool, gradient_backend::Type, hprod_backend::Type, hessian_backend::Type, kwargs::@Kwargs{…})
    @ ADNLPModels \.julia\packages\ADNLPModels\ZWqLN\src\ad.jl:86
 [18] ADNLPModel(f::Function, x0::Vector{…}, lvar::Vector{…}, uvar::Vector{…}; name::String, minimize::Bool, kwargs::@Kwargs{})
    @ ADNLPModels \.julia\packages\ADNLPModels\ZWqLN\src\nlp.jl:146
 [19] ADNLPModel(f::Function, x0::Vector{Float64}, lvar::Vector{Float64}, uvar::Vector{Float64})
    @ ADNLPModels \.julia\packages\ADNLPModels\ZWqLN\src\nlp.jl:133
```

amontoison (Member) commented Jun 25, 2024

I will add QuadraticModels.jl to the breakage tests of ADNLPModels.jl.
Update: We already have it here 🤔

tmigot (Member, Author) commented Aug 6, 2024

@gdalle any idea if this is supposed to work with SparseConnectivityTracer 0.6?

gdalle (Collaborator) commented Aug 6, 2024

The surface-level issue is that we're missing an overload for `!`, which is easily fixed. The harder issue is that once I fix this, I run into matrix multiplication issues:

```julia
julia> ADNLPModel(
         s -> fx + dot(gx, s) + dot(s, Hx * s) / 2,
         zeros(nlp.meta.nvar),
         nlp.meta.lvar - x,
         nlp.meta.uvar - x,
       )
ERROR: TypeError: non-boolean (SparseConnectivityTracer.HessianTracer{SparseConnectivityTracer.IndexSetHessianPattern{Int64, BitSet, Set{Tuple{Int64, Int64}}, false}}) used in boolean context
Stacktrace:
  [1] _mul!(nzrang::typeof(SparseArrays.nzrangelo), diagop::typeof(identity), odiagop::typeof(transpose), C::Vector{…}, A::SparseArrays.SparseMatrixCSC{…}, B::Vector{…}, α::SparseConnectivityTracer.HessianTracer{…}, β::SparseConnectivityTracer.HessianTracer{…})
    @ SparseArrays ~/.julia/juliaup/julia-1.10.4+0.x64.linux.gnu/share/julia/stdlib/v1.10/SparseArrays/src/linalg.jl:881
  [2] spdensemul!(C::Vector{…}, tA::Char, tB::Char, A::SparseArrays.SparseMatrixCSC{…}, B::Vector{…}, _add::LinearAlgebra.MulAddMul{…})
    @ SparseArrays ~/.julia/juliaup/julia-1.10.4+0.x64.linux.gnu/share/julia/stdlib/v1.10/SparseArrays/src/linalg.jl:50
  [3] generic_matvecmul!
    @ ~/.julia/juliaup/julia-1.10.4+0.x64.linux.gnu/share/julia/stdlib/v1.10/SparseArrays/src/linalg.jl:35 [inlined]
  [4] mul!
    @ ~/.julia/juliaup/julia-1.10.4+0.x64.linux.gnu/share/julia/stdlib/v1.10/LinearAlgebra/src/matmul.jl:66 [inlined]
  [5] mul!
    @ ~/.julia/juliaup/julia-1.10.4+0.x64.linux.gnu/share/julia/stdlib/v1.10/LinearAlgebra/src/matmul.jl:237 [inlined]
  [6] *(A::Symmetric{Float64, SparseArrays.SparseMatrixCSC{…}}, x::Vector{SparseConnectivityTracer.HessianTracer{…}})
    @ LinearAlgebra ~/.julia/juliaup/julia-1.10.4+0.x64.linux.gnu/share/julia/stdlib/v1.10/LinearAlgebra/src/matmul.jl:57
  [7] (::var"#13#14")(s::Vector{SparseConnectivityTracer.HessianTracer{SparseConnectivityTracer.IndexSetHessianPattern{…}}})
    @ Main ./REPL[7]:2
  [8] (::ADNLPModels.var"#lagrangian#55"{…})(x::Vector{…})
    @ ADNLPModels ~/.julia/packages/ADNLPModels/vcIV9/src/sparsity_pattern.jl:42
  [9] trace_function(::Type{…}, f::ADNLPModels.var"#lagrangian#55"{…}, x::Vector{…})
    @ SparseConnectivityTracer ~/Work/GitHub/Julia/SparseConnectivityTracer.jl/src/interface.jl:33
 [10] _hessian_sparsity(f::Function, x::Vector{Float64}, ::Type{SparseConnectivityTracer.HessianTracer{…}})
    @ SparseConnectivityTracer ~/Work/GitHub/Julia/SparseConnectivityTracer.jl/src/interface.jl:120
 [11] hessian_sparsity
    @ ~/Work/GitHub/Julia/SparseConnectivityTracer.jl/src/adtypes.jl:57 [inlined]
 [12] compute_hessian_sparsity(f::var"#13#14", nvar::Int64, c!::ADNLPModels.var"#2#3", ncon::Int64; detector::SparseConnectivityTracer.TracerSparsityDetector{…})
    @ ADNLPModels ~/.julia/packages/ADNLPModels/vcIV9/src/sparsity_pattern.jl:51
 [13] ADNLPModels.SparseADHessian(nvar::Int64, f::Function, ncon::Int64, c!::ADNLPModels.var"#2#3"; x0::Vector{…}, coloring::SparseMatrixColorings.GreedyColoringAlgorithm{…}, detector::SparseConnectivityTracer.TracerSparsityDetector{…}, kwargs::@Kwargs{})
    @ ADNLPModels ~/.julia/packages/ADNLPModels/vcIV9/src/sparse_hessian.jl:29
 [14] macro expansion
    @ ~/.julia/packages/ADNLPModels/vcIV9/src/ad.jl:90 [inlined]
 [15] macro expansion
    @ ./timing.jl:395 [inlined]
 [16] ADNLPModels.ADModelBackend(nvar::Int64, f::var"#13#14"; backend::Symbol, matrix_free::Bool, show_time::Bool, gradient_backend::Type, hprod_backend::Type, hessian_backend::Type, kwargs::@Kwargs{…})
    @ ADNLPModels ~/.julia/packages/ADNLPModels/vcIV9/src/ad.jl:86
 [17] ADNLPModel(f::Function, x0::Vector{…}, lvar::Vector{…}, uvar::Vector{…}; name::String, minimize::Bool, kwargs::@Kwargs{})
    @ ADNLPModels ~/.julia/packages/ADNLPModels/vcIV9/src/nlp.jl:146
 [18] ADNLPModel(f::Function, x0::Vector{Float64}, lvar::Vector{Float64}, uvar::Vector{Float64})
    @ ADNLPModels ~/.julia/packages/ADNLPModels/vcIV9/src/nlp.jl:133
 [19] top-level scope
    @ REPL[7]:1
Some type information was truncated. Use `show(err)` to see complete types.
```

Since this particular method of `mul!` in SparseArrays involves a conditional, we can't trace through it with global tracers.
The right issue to follow is adrhill/SparseConnectivityTracer.jl#133, because while we have implemented some array-level functions, multiplication is still missing.
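As a stripped-down illustration (not the SparseArrays code path itself, just a hand-written branch, with `branchy` made up for the example), a global tracer cannot resolve a comparison on a traced value, whereas a local tracer can:

```julia
using SparseConnectivityTracer

# Hand-written branch on a traced quantity: a global tracer has no primal value,
# so detection through this function is expected to error, much like the trace above.
branchy(x) = x[1] != 0 ? x[1]^2 : 2x[1]

hessian_sparsity(branchy, rand(1), TracerLocalSparsityDetector())  # works: pattern at this input
# hessian_sparsity(branchy, rand(1), TracerSparsityDetector())     # expected to error
```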

gdalle (Collaborator) commented Aug 6, 2024

The immediate workaround is always the same: use `TracerLocalSparsityDetector()` if you're 100% sure that the sparsity pattern doesn't depend on the input values.
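For this reproducer, that would look roughly like the sketch below. The `detector` keyword is an assumption based on the `SparseADHessian(...; detector = ...)` frame in the stack trace; the exact way it is forwarded through `ADNLPModel` may differ:

```julia
using LinearAlgebra, ADNLPModels, NLPModels, NLPModelsTest, SparseConnectivityTracer

nlp = BROWNDEN()
x = nlp.meta.x0
fx, gx, Hx = obj(nlp, x), grad(nlp, x), hess(nlp, x)

# Assumed keyword: forward a local sparsity detector to the sparse Hessian backend.
qp = ADNLPModel(
  s -> fx + dot(gx, s) + dot(s, Hx * s) / 2,
  zeros(nlp.meta.nvar),
  nlp.meta.lvar - x,
  nlp.meta.uvar - x;
  detector = TracerLocalSparsityDetector(),
)
```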

gdalle (Collaborator) commented Aug 6, 2024

This should probably be documented in ADNLPModels, similar to #283

tmigot (Member, Author) commented Aug 6, 2024

Thanks @gdalle! I will use `TracerLocalSparsityDetector()` as a workaround; since these are quadratic problems, the sparsity pattern won't change.

tmigot (Member, Author) commented Aug 6, 2024

@gdalle Is it faster to use `TracerLocalSparsityDetector()` than the default if we know the sparsity pattern won't change?

gdalle (Collaborator) commented Aug 6, 2024

It shouldn't be, precisely because the sparsity pattern doesn't change, so you should get the same number of colors and so on. Detection might be a bit slower in local mode, but that is probably negligible.
However, let me stress the importance of running the detection with a randomized input vector rather than, e.g., a vector full of zeros, which can give rise to accidental zeros in the output (entries that would be nonzero at another input). See the docstring of the local sparsity detector for an example.
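A minimal sketch of that pitfall, assuming the `hessian_sparsity(f, x, detector)` entry point and with `f` made up for the example (the branch stands in for any value-dependent structure):

```julia
using SparseConnectivityTracer

# Toy objective whose Hessian structure depends on the input:
# for x[1] >= 1 the (1, 1) entry is nonzero, otherwise the function is linear in x[1].
f(x) = x[1] >= 1 ? x[1]^2 + x[2] : x[1] + x[2]

detector = TracerLocalSparsityDetector()

hessian_sparsity(f, zeros(2), detector)      # detection at zeros misses the (1, 1) entry
hessian_sparsity(f, rand(2) .+ 1, detector)  # detection on the other branch captures it
```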

tmigot (Member, Author) commented Aug 6, 2024

Ok, just wanted to make sure. Depending on how frequent these issues turn out to be, I am thinking about adding a kwarg to ADNLPModel/ADNLSModel for this.
That would avoid having to manually pass the detector and may be more user-friendly.
