
DGMulti support for curved meshes and parabolic terms #1400

Merged: jlchan merged 41 commits into trixi-framework:main on Apr 28, 2023

Conversation

jlchan (Contributor) commented Apr 17, 2023

This PR adds support for parabolic terms on curved meshes. The previous approach for parabolic terms was to

  1. compute the weak-form DG gradient and store it at nodal points
  2. interpolate the gradient to quadrature points, compute the viscous flux
  3. compute the weak-form DG divergence and store it at nodal points in du

Here, we switch to the following workflow

  1. compute the strong-form DG gradient and store it at quadrature points.
  2. compute/store the viscous flux at quadrature points (the only interpolation is to interpolate the solution to quadrature points)
  3. compute the weak-form DG divergence and store it at nodal points in du

This is mathematically equivalent to the former approach, but ensures that the resulting scheme is symmetric and entropy/energy dissipative even on curved meshes. It should also be slightly more efficient (fewer multiplications by interpolation matrices).
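For illustration, here is a minimal sketch of the new workflow on a single element in Julia. This is not Trixi.jl API: the operators `Vq`, `Dq`, and `W` and all numerical values are placeholders, and surface terms and the mass-matrix inversion are omitted.

```julia
using LinearAlgebra

# Placeholder operators for one element with 3 nodes and 3 quadrature points.
Vq = Matrix{Float64}(I, 3, 3)                        # nodal -> quadrature interpolation
Dq = [-1.5 2.0 -0.5; -0.5 0.0 0.5; 0.5 -2.0 1.5]     # differentiation at quadrature points
W  = Diagonal([1/3, 4/3, 1/3])                       # quadrature weights
mu = 0.01                                            # viscosity
u  = [0.0, 0.5, 1.0]                                 # nodal solution values

# 1. strong-form DG gradient, stored at quadrature points
grad_q = Dq * (Vq * u)

# 2. viscous flux computed and stored directly at quadrature points
flux_q = mu .* grad_q

# 3. weak-form DG divergence, stored at nodal points in du
#    (volume contribution only; surface terms and mass matrix omitted)
du = -(Dq * Vq)' * (W * flux_q)
```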

Unfortunately, because this involved rewriting much of the parabolic framework, it is a somewhat large PR - apologies in advance.

jlchan added 4 commits April 17, 2023 14:56
- the 1/h penalty scaling is not necessary for coercivity (BR1 is coercive with any penalty parameter > 0; a factor of 1/h is only necessary for IPDG; see the sketch after this list)
- simplifies the implementation for curved meshes
- it's not in CI tests yet (should be added soon)
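For context, the distinction referenced in the first commit note can be sketched as follows (standard notation, not taken from this PR): the BR1 interface penalty only needs a positive constant, while the IPDG penalty must scale with the inverse mesh size.

```latex
\[
  \text{BR1: } \tau \, [\![u]\!] \quad (\text{any } \tau > 0),
  \qquad
  \text{IPDG: } \frac{C}{h} \, [\![u]\!] \quad (C \text{ sufficiently large}).
\]
```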
codecov bot commented Apr 18, 2023

Codecov Report

Merging #1400 (a58580e) into main (f5e6f21) will decrease coverage by 4.78%.
The diff coverage is 97.94%.

@@            Coverage Diff             @@
##             main    #1400      +/-   ##
==========================================
- Coverage   95.97%   91.19%   -4.78%     
==========================================
  Files         351      353       +2     
  Lines       29122    29337     +215     
==========================================
- Hits        27949    26752    -1197     
- Misses       1173     2585    +1412     
Flag Coverage Δ
unittests 91.19% <97.94%> (-4.78%) ⬇️

Flags with carried forward coverage won't be shown.

Impacted Files Coverage Δ
src/equations/laplace_diffusion_2d.jl 83.33% <0.00%> (ø)
src/solvers/dgmulti/dg_parabolic.jl 90.20% <95.33%> (+0.63%) ⬆️
...multi_2d/elixir_navierstokes_convergence_curved.jl 100.00% <100.00%> (ø)
...multi_3d/elixir_navierstokes_convergence_curved.jl 100.00% <100.00%> (ø)
src/solvers/dgmulti/dg.jl 94.12% <100.00%> (+0.02%) ⬆️

... and 45 files with indirect coverage changes

jlchan marked this pull request as ready for review April 25, 2023 17:09
ranocha (Member) left a comment

Thanks a lot for working on this, @jlchan! Please find below some comments after an initial review. It would be great to get some additional eyes on this.

src/equations/laplace_diffusion_2d.jl: resolved review thread (outdated)
src/solvers/dgmulti/dg.jl: resolved review thread
- p * rho * rho_yy ) * mu_ )

return SVector(du1, du2, du3, du4)
end
Member:

I just assume you copied this setup and did not review it properly (same in 3D).

jlchan (Contributor, Author):

Is this referring to the type-unstable (indexing) comment above?

Member:

No - to the actual formulas of the manufactured solution

jlchan (Contributor, Author):

It's the exact same as the setup for elixir_navierstokes_convergence.jl. I can add a comment mentioning this?

jlchan (Contributor, Author):

Alternatively, I could just run elixir_navierstokes_convergence.jl and pass a new mesh through trixi_include instead of creating new elixirs.

jlchan (Contributor, Author):

Nevermind, using trixi_include didn't work. I added some comments noting the origin of the manufactured solution.

Member:

Alternatively... shouldn't we add the NSE convergence test to the CompressibleNSEDiffusionXD types? IIRC, we do have a similar function for most other equation systems. Or was there an issue with that which made it not work or made it super awkward?

Member:

It's fine from my side to keep it as it is - I just wanted to say that I didn't want to read it 😅

jlchan (Contributor, Author):

@sloede we could. The only potential issue I see is that the source terms and initial condition require the functions mu() and prandtl_number() to be defined (see https://github.com/jlchan/Trixi.jl/blob/a8c0933bd0077c45372c3937c40fce772e348886/examples/dgmulti_2d/elixir_navierstokes_convergence.jl#L56-L57). These functions are necessary because the initial conditions cannot specialize on only one equation type: they are called with both equations::CompressibleEulerEquations2D and equations::CompressibleNavierStokesDiffusion2D in different places.

To me, moving the NSE convergence test functions out of the elixir makes these implicit requirements even less clear.
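For reference, a minimal sketch of the pattern being discussed, following the structure of the linked elixir; the numeric values and the initial condition body below are simplified placeholders, not the actual manufactured solution.

```julia
using Trixi

# Defined as zero-argument functions so that the initial condition and source terms
# can refer to the same values (placeholder values).
prandtl_number() = 0.72
mu() = 0.01

equations = CompressibleEulerEquations2D(1.4)
equations_parabolic = CompressibleNavierStokesDiffusion2D(equations, mu = mu(),
                                                          Prandtl = prandtl_number())

# Cannot specialize on `CompressibleNavierStokesDiffusion2D` alone: the same function
# is also called with `equations::CompressibleEulerEquations2D`.
function initial_condition_convergence_test(x, t, equations)
    rho = 1.0 + 0.1 * sin(pi * (x[1] + x[2]))   # placeholder profile
    v1 = v2 = 0.1
    p = 1.0
    return prim2cons(SVector(rho, v1, v2, p), equations)
end
```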

Member:

Good point. Let's keep them as they are, please.

src/solvers/dgmulti/dg_parabolic.jl: six resolved review threads (four marked outdated)
jlchan (Contributor, Author) commented Apr 26, 2023

@ranocha any idea why Invalidations.yml keeps failing?

ranocha (Member) commented Apr 27, 2023

> @ranocha any idea why Invalidations.yml keeps failing?

Invalidations CI run fails because of aviatesk/JET.jl#499. That's not critical right now.

src/solvers/dgmulti/dg_parabolic.jl: three resolved review threads (all marked outdated)
jlchan requested a review from ranocha April 28, 2023 03:14
jlchan requested a review from ranocha April 28, 2023 14:32
ranocha (Member) left a comment

Thanks a lot! This looks good to me.

jlchan merged commit 458daa8 into trixi-framework:main Apr 28, 2023