
Jacobian-Free Krylov Versions for TR/LM/GN #282

Merged: 8 commits into SciML:master from ap/krylov on Nov 22, 2023

Conversation

@avik-pal (Member) commented Nov 13, 2023

Fixes #167
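
For context, a minimal sketch (not taken from this PR) of how a Jacobian-free Krylov variant might be invoked: a Krylov solver from LinearSolve.jl is passed as the linsolve so that only products with J (and Jᵀ) are needed rather than a materialized Jacobian. The problem, keyword, and solver choice below are illustrative assumptions.

using NonlinearSolve, LinearSolve

# toy square system u .^ 2 .- p = 0, purely illustrative
f(u, p) = u .^ 2 .- p
prob = NonlinearProblem(f, [1.0, 1.0], [2.0, 3.0])

# passing a Krylov linear solver is, as I understand the PR, what selects the
# Jacobian-free path for TR/LM/GN; the exact keyword here is an assumption
sol = solve(prob, TrustRegion(; linsolve = KrylovJL_GMRES()))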

codecov bot commented Nov 13, 2023

Codecov Report

Attention: 9 lines in your changes are missing coverage. Please review.

Comparison is base (0026bc1) 49.06% compared to head (bcfcc16) 94.04%.

Files                  Patch %   Lines
src/jacobian.jl        91.52%    5 Missing ⚠️
src/linesearch.jl      72.72%    3 Missing ⚠️
src/NonlinearSolve.jl   0.00%    1 Missing ⚠️
Additional details and impacted files
@@             Coverage Diff             @@
##           master     #282       +/-   ##
===========================================
+ Coverage   49.06%   94.04%   +44.97%     
===========================================
  Files          19       20        +1     
  Lines        1824     1896       +72     
===========================================
+ Hits          895     1783      +888     
+ Misses        929      113      -816     


@avik-pal avik-pal marked this pull request as ready for review November 21, 2023 20:54
@avik-pal (Member Author) commented:

@ChrisRackauckas this is good to go on my end

__init_JᵀJ(J::StaticArray) = MArray{Tuple{size(J, 2), size(J, 2)}, eltype(J)}(undef)
__init_JᵀJ(J::Number, args...; kwargs...) = zero(J), zero(J)
function __init_JᵀJ(J::AbstractArray, fu, args...; kwargs...)
    JᵀJ = J' * J
Review comment (Member):

what is actually using this?

Reply (Member Author):
None of the algorithms use it by default, but if LM/GN/TR is forced to use a linear solver that only works with square matrices, then this path needs to be triggered.
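
Concretely, the square-matrix fallback amounts to taking the Gauss-Newton step from the normal equations JᵀJ δ = Jᵀ fu instead of the rectangular system J δ = fu. A rough stand-alone illustration of the idea (not the package's internal code path):

using LinearAlgebra

J  = rand(5, 3)            # rectangular least-squares Jacobian
fu = rand(5)               # residual

JᵀJ  = J' * J              # square 3×3 system, what __init_JᵀJ pre-allocates for
Jᵀfu = J' * fu
δ = JᵀJ \ Jᵀfu             # Gauss-Newton step via the normal equations

δ ≈ J \ fu                 # matches the direct least-squares solve (up to conditioning)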

end
autodiff = __concrete_vjp_autodiff(vjp_autodiff, jvp_autodiff, uf)
Jᵀ = VecJac(uf, u; fu, autodiff)
JᵀJ_op = SciMLOperators.cache_operator(Jᵀ * J, u)
Review comment (Member):

This operator shouldn't need to be constructed.

Reply (Member Author):

Without the cache_operator call, it complained that for in-place operations the cache needs to be set up first (something along those lines).
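
A rough stand-alone illustration of that point (assumed, not the package code): a composed SciMLOperator generally has to go through cache_operator before it can be applied in place with mul!.

using SciMLOperators, LinearAlgebra

A = MatrixOperator(rand(4, 4))
B = MatrixOperator(rand(4, 4))
op = A * B                    # lazy composition, no intermediate storage yet

u = rand(4)
v = similar(u)

op = cache_operator(op, u)    # allocate the internal cache for in-place application
mul!(v, op, u)                # the 3-arg in-place form now works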

src/jacobian.jl: additional review thread (outdated, resolved)
@ChrisRackauckas ChrisRackauckas merged commit 46912f2 into SciML:master Nov 22, 2023
11 checks passed
@avik-pal avik-pal deleted the ap/krylov branch November 22, 2023 18:40
Development

Successfully merging this pull request may close these issues:

TrustRegion is missing optimizations of using vjps directly
2 participants
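
The linked issue above is about using vjps directly: the Jᵀf quantity TrustRegion needs can be formed from a single vector-Jacobian product without ever building J. A hedged sketch of that idea (Zygote is used here only for illustration; the PR itself goes through VecJac, as seen in the diff above):

using Zygote, ForwardDiff

f(u) = [u[1]^2 + u[2], sin(u[1]) * u[2], u[2]^3]   # illustrative residual
u = [0.5, 1.5]
fu = f(u)

_, back = Zygote.pullback(f, u)
Jᵀfu = back(fu)[1]                                 # one vjp, no Jacobian materialized

Jᵀfu ≈ ForwardDiff.jacobian(f, u)' * fu            # dense check, illustration only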