It's not uncommon to see this behavior in the output:
Forcing (precipitation or evaporation) changes abruptly (at t_i, t_{i+1}, t_{i+2}), after which we do a forward fill. But Ribasim has to stomach the sudden change, and is thereby forced to create a lot more timesteps (red dots) to accurately simulate the effect of the shock.
But obviously, we don't care about the dynamics of the shock at all: it's just an artifact, a result of having the forcing as daily sums. We get sub-second (or worse) dynamics because the solver is trying its darnedest to represent the large jumps in the daily sums. That feels very disproportionate.
I've discussed this with @visr before, and @SouthEndMusic mentioned it now in relation to #1918 and #1919: the forcing is VERY discontinuous (not even C0).
Smoothing forcing time series
It's plausible that we could substantially reduce the number of timesteps required by lessening the effects of these shocks. We've been hesitant to do so because it basically requires interpolation which makes the input difficult to interpret: interpolation might not conserve the total inflow, or it spreads it out in unforeseen ways.
(I reckon that interpolating for e.g. level boundaries is probably a lot less controversial.)
However, there might be an easier way out: make it an explicit, optional pre-processing step. If we were to add a function to e.g. ribasim-python that takes a pandas Series of precipitation and returns a (slightly) smoother one, I think that would be perfectly acceptable to most modelers (especially if you get a substantial run time reduction). You run the function, you plot the smoothed time series together with the original daily sums, and you check it's still close enough / preserves the thing you care about.
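To make the idea concrete, here's a minimal sketch of such a pre-processing helper. The function name `smooth_forcing` and the moving-average-plus-rescale approach are my assumptions, not an existing ribasim-python API; the point is only that smoothing while conserving the cumulative inflow is a few lines of code (shown with plain lists for self-containment, but the same logic applies to a pandas Series).

```python
def smooth_forcing(values, window=3):
    """Hypothetical pre-processing helper (not part of ribasim-python):
    smooth a daily-sum forcing series with a centered moving average,
    then rescale so the total volume is conserved."""
    n = len(values)
    half = window // 2
    smoothed = []
    for i in range(n):
        lo, hi = max(0, i - half), min(n, i + half + 1)
        smoothed.append(sum(values[lo:hi]) / (hi - lo))
    total, total_s = sum(values), sum(smoothed)
    # rescale so the cumulative inflow matches the original daily sums
    return [v * total / total_s for v in smoothed] if total_s else smoothed

daily = [0.0, 12.0, 0.0, 0.0, 5.0, 5.0, 0.0]
smoothed = smooth_forcing(daily)
# total volume is preserved, but the 12.0 spike is flattened
assert abs(sum(smoothed) - sum(daily)) < 1e-9
assert max(smoothed) < max(daily)
```

The plot-and-check workflow described above then amounts to comparing `daily` and `smoothed` and verifying the totals match.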
I think linear interpolation would already make a significant difference. Maybe the primary challenge is making sure that the solver doesn't skip too much of the "up- and off-ramp"? To preserve cumulative inflow, it should interpolate at `t + 0.5 * dt`, where `dt` is the proposed timestep (so it requires knowledge of `dt` next to `t`).
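The midpoint trick works because, within a single linear segment, the value of a piecewise-linear rate at `t + 0.5 * dt` equals its mean over `[t, t + dt]`, so multiplying by `dt` reproduces the exact integral over the step. A small sketch (the `linear_interp` helper is illustrative, not Ribasim code, and assumes the midpoint stays inside one segment):

```python
def linear_interp(ts, vs, t):
    # simple piecewise-linear evaluation (assumes ts sorted, t in range)
    for i in range(len(ts) - 1):
        if ts[i] <= t <= ts[i + 1]:
            w = (t - ts[i]) / (ts[i + 1] - ts[i])
            return (1 - w) * vs[i] + w * vs[i + 1]
    return vs[-1]

ts, vs = [0.0, 1.0, 2.0], [0.0, 2.0, 2.0]
t, dt = 0.0, 1.0
# exact integral of the ramp over [0, 1] is 0.5 * 2.0 * 1.0 = 1.0
midpoint_flux = linear_interp(ts, vs, t + 0.5 * dt) * dt
assert abs(midpoint_flux - 1.0) < 1e-12
```

Evaluating at `t` instead would give a flux of 0.0 for this step and lose the entire ramp volume, which is exactly the "skipping the up-ramp" concern.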
Coupler flows
The coupler flows are also forward filled. Unlike forcing, these cannot simply be pre-processed, nor can precipitation be (slightly) redistributed over time. We have access to a previous flow and the flow for the next period; however, we are more or less free (?) to distribute the flow within the coupler timestep. Maybe a scheme like this would be somewhat effective:
I.e. you always end at the specified value.
It will never be quite perfect: in case of increasing flows, you can redistribute, but you cannot when a flow goes to zero. In that case, you have to make the jump suddenly (but currently we're doing that for every change).
I'm guessing that in most cases, you can redistribute flows over time. You have to select some intermediate `t` value and some degree of "over-" or "undershoot". I think in principle, you'd want to stick to the forward fill curve as closely as possible, which means the intermediate `t` lies close to the initial `t`; you could specify an allowed absolute deviation, which determines the intermediate `t`; if the necessary redistribution is less than that, you simply select the intermediate `t` at half the coupler timestep.
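As a sketch of what such a scheme could look like (the `ramp_profile` helper and the two-segment shape are my assumptions, not an existing imod_coupler mechanism): replace the forward-filled flow over `[t0, t1]` with two linear segments, `q_prev -> q_mid -> q_next`, that end at the specified value and conserve the forward-fill volume `q_next * (t1 - t0)`. With the intermediate `t` fixed at half the coupler step, the conserving intermediate value follows from equating trapezoid areas.

```python
def ramp_profile(q_prev, q_next, t0, t1):
    """Hypothetical redistribution sketch: a two-segment linear profile
    that ends at the specified flow q_next and conserves the volume
    q_next * (t1 - t0) of the original forward fill."""
    t_mid = 0.5 * (t0 + t1)
    # solve 0.25*dt*(q_prev + 2*q_mid + q_next) == q_next*dt for q_mid
    q_mid = (3.0 * q_next - q_prev) / 2.0
    return [(t0, q_prev), (t_mid, q_mid), (t1, q_next)]

profile = ramp_profile(0.0, 2.0, 0.0, 1.0)
# trapezoid integral of the profile equals the target volume 2.0 * 1.0
area = sum(0.5 * (v0 + v1) * (x1 - x0)
           for (x0, v0), (x1, v1) in zip(profile, profile[1:]))
assert abs(area - 2.0) < 1e-12
```

Note that for an increasing flow this gives `q_mid = 3.0`, i.e. a deliberate overshoot above `q_next = 2.0`: that is the "over- or undershoot" trade-off described above, and where a deviation bound would clip the scheme.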
Concluding
I think you could achieve this with:
linear interpolation in Ribasim (which should be an option / switch on time series somewhere or something)
a pre-processing utility
an imod_coupler modification that calls ribasim's update twice instead of once (and uses the linear interpolation behind the scenes)
Notes
It becomes a lot more complicated if `dt` isn't available during interpolation...
This adds dynamic behavior: rather than a flow that is piecewise constant over time, it becomes piecewise linear over time. Currently, the solver has to stomach the shock, but afterwards it can try to achieve some sense of a steady state (until the next shock). With linearly varying flows, that steady state doesn't exist.
I made some progress on this, see SciML/DataInterpolations.jl#364 (comment). I also came across the problem that $\Delta t$ has to be known, so we cannot do this for BMI supplied data, only for when the full time series is known.