Reboot of Frost Action, Potential using PRISM #84

Open · brownag opened this issue Feb 8, 2018 · 18 comments

brownag commented Feb 8, 2018

On the West Coast it has long been recognized that potential frost action (the 250 degree-days-below-zero isoline) does not align well with the mesic-thermic break (which is commonly used to predict frost action in mid-continental areas).

Links to NSSH:

From NSSH Part 618.33 Frost Action, Potential:

Part 618, Subpart B, Exhibits, Section 618.85 is a map that shows the design freezing index values in the continental United States. The values are the number of degree days below 0° C for the coldest year in a period of 10 years. The values indicate duration and intensity of freezing temperatures. The 250 isoline is the approximate boundary below which frost action ceases to be a problem.

We currently derive GDD and FFD estimates from PRISM daily data. It would not be too challenging to modify the GDD routine to calculate the "number of degree days below 0 degrees C" (a.k.a. the freezing index).
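A minimal sketch of the core accumulation, using toy data in place of the PRISM extraction:

tavg <- c(2.5, -1.0, -3.2, 0.4, -6.1)        # daily mean air temperature, deg C (toy data)
FI <- sum(abs(pmin(0, tavg)), na.rm = TRUE)  # degree days below 0 deg C
FI
# 10.3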

The goal would be to create a raster map analogous to the one below (from NSSH 618.85), to provide further quantitative justification for assignments (or lack thereof) of Frost Action, Potential in MLRA 22A.

[figure: ddbz_isolines — NSSH 618.85 freezing index isoline map]

brownag commented Feb 8, 2018

FURTHERMORE: following the table from NSSH 618.84 suggests that all of our mesic components should have some potential frost action, and would get a "moderate" class in most instances. While portraying potential frost action even in areas that may not warrant it may be the most conservative approach, it would be good to understand where our map units sit with respect to the 250 isoline.

brownag commented Feb 8, 2018

Question: how should we interpret the "coldest year in 10 years" or "average of three coldest years in 30 years"? How would this compare to, say, the 0.90 quantile of the freezing index (a.k.a. the design freezing index) derived from 30 years of PRISM daily data?
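One way to eyeball the difference between the two readings, using a hypothetical 30-year record of annual FI values:

set.seed(1)
FI.annual <- rexp(30, rate = 1 / 150)  # hypothetical annual freezing index values
# "coldest year in 10": roughly the 90th percentile of annual FI
quantile(FI.annual, probs = 0.9)
# "average of the three coldest years in 30": mean of the three largest FI values
mean(sort(FI.annual, decreasing = TRUE)[1:3])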

brownag commented Feb 16, 2018

We decided the "coldest" year would be defined as the year with the greatest number of accumulated degree days below zero, and not by some unrelated metric such as the year with the lowest MAAT.

dylanbeaudette commented Feb 16, 2018

Using the daily 800m PRISM data (1981-2010) we get this approximation of the design freezing index (deg. C):

[figure: dfi-q90-deg-c-250-isoline]

Methods:

  • using units of degrees Celsius, and daily average air temperature (Tavg)
  • freezing degree days for a single year: FI = sum( abs( min(0, Tavg) ) )
  • design freezing index over the 30-year record: DFI = Q90( FI ), where FI is the stack of annual FI values (see the sketch after this list)
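A hedged sketch of this per-pixel (or per-station) logic, assuming tavg.by.year is a list of daily Tavg vectors (deg C), one element per year of the record:

FI.annual <- sapply(tavg.by.year, function(tavg) sum(abs(pmin(0, tavg)), na.rm = TRUE))
DFI <- quantile(FI.annual, probs = 0.9, na.rm = TRUE)  # Q90 of the annual FI stack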

Notes:

  • There is a fairly large difference in where the 250 DFI isoline falls, depending on the temperature units.
  • The 90th percentile of FI seems to track the notion of "coldest year in 10 years".
  • The "average of 3 coldest years in 30" method gives different results, but spatial patterns are the same.

Remaining questions:

  • Is the Q90 DFI sufficiently similar to the vague definition in the NSSH?
  • What temperature units should the magical 250 isoline be based on?

brownag commented Feb 16, 2018

For comparison to local west coast data, we aggregated all available data from CA DWR CDEC stations and calculated the design freezing index.

The design freezing index was calculated for each station × year combination. Sites with less than 10 years of data were omitted, and low-DFI stations (less than 10 Fahrenheit degree days) were also removed to avoid the complexities of zero-inflated data with widely varying predictors. This wound up removing most of the coastal stations, but left good coverage across MLRA 22A.

The following figure shows the PRISM daily-based predictions (hollow) and the elevation-based regression predictions fit to CDEC data (solid), as a function of the observed value at the CDEC locations.
[figure: dfi90_obs_pred]

This is a plot of the estimated DFI90, using a regression model (elevation as the sole predictor) fit to the CDEC station observations (black points). The CA630 extent is shown for reference.
[figure: cdec_dfi90]

This shows the 250 isoline (assuming Celsius degree days) relative to CA630. Green areas are where DFI90 > 250.
[figure: cdec_dfi90_gt250_c]

And this one shows the difference in the location of the isoline when using Celsius versus Fahrenheit degree days. Green means there is no difference (i.e., DFI90_C > 250 agrees with DFI90_F > 250), and white means there is a difference. Fahrenheit degree days bring the isoline to a lower elevation, but it remains outside CA630.
[figure: cdec_dfi90_cvsf]

brownag commented Feb 20, 2018

Considering the use of additional predictors: annual beam radiance (250 m and 30 m) didn't seem to improve the estimate much, probably due to a lack of stratification of stations across the ABR and elevation gradients.

The "percent precipitation as rain" derivative Dylan made shows some promise (marginal improvement in information criterion of model) ... but is highly correlated with elevation (note variance-inflation factor of ~9).

> car::vif(step(lm(dfi90~Elevation+pctrain,data=stations.sub)))
Start:  AIC=1432.25
dfi90 ~ Elevation + pctrain

            Df Sum of Sq      RSS    AIC
<none>                   23030633 1432.2
- Elevation  1    448813 23479445 1432.5
- pctrain    1   1024701 24055334 1435.3
Elevation   pctrain 
 9.165907  9.165907 

[figure: pptasrainvelevation — percent precipitation as rain vs. elevation]

Perhaps try principal components regression?
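For what it's worth, a minimal sketch of what that might look like, assuming the stations.sub data frame from the vif() call above (pcr() is from the pls package, not base R):

library(pls)
# principal components regression on standardized predictors, with cross-validation
m.pcr <- pcr(dfi90 ~ Elevation + pctrain, data = stations.sub,
             scale = TRUE, validation = "CV")
summary(m.pcr)  # % variance explained per component, CV error
# predict from the first component only, side-stepping the collinearity
p1 <- predict(m.pcr, ncomp = 1)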

dylanbeaudette commented

Fascinating. I doubt there are many other data sources that will contribute to this model, especially since most of the PRISM stack is so highly correlated. I'll copy the bioclim stack to the L drive tomorrow; maybe there is something in there.

Did you try using the mean annual precipitation?

brownag commented Feb 21, 2018

I didn't try MAP. I was hesitant to use anything so tightly correlated with the elevation trend. Though if I used a method robust to correlated predictors, it might be worthwhile. Once you get bioclim in there I'll play around with some over-specified models, stepwise regression, and maybe some principal components.

But before long I will have to bundle this up and pitch it as a future project; redefining Frost Action, and more in-depth modeling of the design index or its logical next step (frost penetration depth), is well beyond the scope of what I am cleared to do here.

The premise of this investigation was more or less based on the idea that the 250 isoline is meaningful from the perspective of frost action. But it seems like the only reference to that magical line is from that old US Army School of Engineers student reference... I haven't found it in any of the older or more recent Army Corps documents.

dylanbeaudette commented

Agreed: time to nail down the definition, notify others, and move on. Let me know if I can help integrate your DBZ code into sharpshootR. There are a number of climate-related functions in there and elsewhere that should probably (eventually) be moved into a new package.

jskovlin commented Mar 6, 2018

Could you use the SMAP data for validation? The SMAP_L4_Frozen_Area data looks interesting...
[figure: smap_screen_shot]

brownag commented Mar 6, 2018

@jskovlin The pattern looks quite similar! Neat. Worth looking at a little closer.

brownag commented Mar 6, 2018

Had correspondence with Steve Campbell and Bob Dobos on this.

In lieu of actually tracking down the source document to verify (I have tried in several places, through NAL, etc., to no avail), it seems likely that the degree-day units were Fahrenheit.

We also discussed that the Air-Freezing Index is conventionally calculated over some sort of freezing season. The NSSH definition (upon revision) should address the period over which the FI is calculated. Steve suggested it was probably worth keeping the full year rather than a subset (<365 days).

Offsetting the start of the year from January 1st to August 1st might give a more meaningful cumulative value for an individual freezing season (at least in our hemisphere), at the cost of having to derive it from two different calendar years. I have not yet checked whether this would give different results for the per-station DFI, but my feeling is that using the calendar year balances out. Will verify.
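A minimal sketch of that offset, assuming d has the POSIXct datetime column returned by CDECquery():

# assign each day to an August 1 - July 31 "freezing season" year
d$season.year <- as.integer(format(d$datetime, "%Y")) +
  (as.integer(format(d$datetime, "%m")) >= 8)
# split(d, d$season.year) would then replace the calendar-year split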

dylanbeaudette commented May 2, 2018

Relevant notes from the component "Frost Action" calculation:

Updated 03/15/2018

Calculates the frost action potential for a component (non-irrigated condition)

CAUTION! This calculation may not predict the correct frost action in mesic temperature regimes on the West Coast.

Data elements used:
- Soil Moisture regime
- Soil Temperature regime
- Family particle size class
- Soil order
- Component Month Soil Temperature (used for mesic temp regimes on the West Coast)

  1. The calculation is based on NSSH Exhibit 618.84.
  2. Taxonomic family particle-size classes are used in the calculation, which may not represent the particle-size distribution outside the control section. If the family particle-size class differs from the part of the soil that freezes, manual adjustments may be needed.
  3. If the temperature regime is isomesic or warmer, then a "none" frost action rating is assigned.
  4. Mesic and colder Histosols (and Histels) are given a frost action rating of "high".
  5. If the soil moisture regime, temperature regime, or family particle-size class is null, a null is returned by the calculation (excluding the two conditions above).
  6. If the family particle-size class is null or "not used" and the component kind is "miscellaneous area" or "taxon above the family", then the weighted-average particle size of the top 100 cm is used. If the particle-size separates are null, then a null is returned.
  7. If the component is a Psamment or Psammaquent, or has a psammentic subgroup, then "sandy" is used for the particle-size class in the calculation.
  8. For contrasting family particle-size classes, the first particle-size class is used in the calculation.
  9. If more than one soil moisture regime is populated, then the most limiting one (e.g., aquic) is used.
  10. If a component has irrigated crop yields populated in the Component Crop Yield table, then an irrigated frost action is reported (in the report only) along with the non-irrigated potential. Only the non-irrigated frost action potentials are set in the database.
  11. If mesic and all monthly soil temperatures are >= 1 degree Celsius (if populated), then the frost action potential is "none" (see the sketch after this list).
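A rough sketch of rules 3, 4, and 11 as code, with the NSSH 618.84 PSC × moisture lookup stubbed out. The function name, its arguments, and the list of "isomesic or warmer" regimes are my own assumptions, not the NASIS implementation:

# hypothetical sketch, not the NASIS CVIR code
frostActionClass <- function(tempregime, taxorder, min.monthly.soiltemp = NA) {
  # rule 3: isomesic or warmer -> "none" (this regime list is an assumption)
  if (tempregime %in% c('isomesic', 'isothermic', 'isohyperthermic', 'thermic', 'hyperthermic'))
    return('none')
  # rule 4: mesic and colder Histosols (Histels approximated here by all Gelisols)
  if (taxorder %in% c('histosols', 'gelisols'))
    return('high')
  # rule 11: mesic with all monthly soil temperatures >= 1 deg C -> "none"
  if (tempregime == 'mesic' && !is.na(min.monthly.soiltemp) && min.monthly.soiltemp >= 1)
    return('none')
  # all remaining cases: NSSH 618.84 PSC x moisture regime lookup (not implemented here)
  return(NA_character_)
}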

dylanbeaudette commented

Item no. 11 above references this bit of code:

# On the West Coast the frost action boundary does not correspond to the thermic mesic boundary. 
# Mesic can have 'none' corresponding to the 250 freezing index isoline (below which frost action ceases to be a problem).
ELSE IF tempregime == "mesic" and soitempmm >= 1 THEN "none"	# minimum monthly soil temp in any month 

Among the SSR02 component data, 98% of the component keys are missing records in the cosoiltemp table. This means (I think) that the calculation is not going to work for the bulk of our mesic map units.
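A hedged sketch of how one might quantify that coverage via Soil Data Access, assuming the cosoiltemp table is exposed through SDA and joins to component on cokey (restricting to mesic components would additionally need the appropriate temperature-class column):

library(soilDB)
# count components with and without cosoiltemp records
q <- "SELECT COUNT(DISTINCT c.cokey) AS n_components,
             COUNT(DISTINCT st.cokey) AS n_with_soiltemp
      FROM component c
      LEFT JOIN cosoiltemp st ON c.cokey = st.cokey;"
res <- SDA_query(q)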

brownag commented Aug 20, 2018

Yes, the cosoiltemp code appeared to have been added to allow a "mesic" STR to have a frost action class of "none". But it is not clear that there are any instances where tempregime == mesic, the soil temperature data are all present, and the minimum monthly value exceeds 1. As such, our mesic components calculate as having frost action, with the class depending on the PSC and moisture regime.

dylanbeaudette commented

@brownag: did the DFI calculation make it into sharpshootR or some such package? It might be worth adding, as the FFD estimator has found a nice home there.

brownag commented Jan 9, 2020

@dylanbeaudette Negative. But I agree it should be added.

Here is a working example of the DFI calculation for CDEC data, with a routine for data cleaning (which is pretty necessary for CDEC). Open to comment.

library(sharpshootR)
library(soilDB)

getData <- function(station_id) {
  message(station_id)
  # sensor 30 = daily mean air temperature (deg F)
  x <- try(CDECquery(id = station_id, sensor = 30, interval = 'D',
                     start = '1900-01-01', end = '2020-12-31'))
  if (inherits(x, 'try-error'))
    return(NULL)
  return(x)
}

# grab a single station's data
foo <- getData("RND")

plot(value ~ datetime, data = foo, pch = ".",
     main = "Mean Daily Air Temperature Timeseries\nROUND MOUNTAIN (CDF) - MCCLOUD, CA",
     ylab = "Mean Daily Air Temperature, degrees F", xlab = "Time")
abline(h = 32, lwd = 2, col = "red")

.SimpleTemperatureFilter <- function(y, nodata.value = -9999, bad.data.threshold = -50, quantile.threshold = 0.999) {
  # filter known NoData values
  y[y == nodata.value] <- NA
  
  # throw out physically unreasonable values (typically strongly negative values in CDEC data)
  y[which(y < bad.data.threshold)] <- NA
  
  # use the absolute first difference of the series to find discontinuous / anomalous
  # extreme values; duplicate the first point so that indices align with the original
  # series (the first difference is zero); differences involving NA are ignored by
  # quantile() and which()
  da.y <- abs(diff(c(y[1], y)))
  thresh <- quantile(da.y, probs = quantile.threshold[1], na.rm = TRUE)
  y[which(da.y > thresh)] <- NA
  
  # trim the remaining extreme tails of the distribution
  thresh2 <- quantile(y, probs = c(1 - quantile.threshold, quantile.threshold), na.rm = TRUE)
  y[which(y <= thresh2[1] | y >= thresh2[2])] <- NA
  
  return(y)
}

FreezingIndex <- function(d, frostTemp, p, FUN = .SimpleTemperatureFilter, ...) {
  # annual freezing index: accumulated degree days below frostTemp, by calendar year
  d.y <- do.call('rbind',
          lapply(split(d, f = d$year),
            function(i) {
                a.y <- alignDOY(i$datetime, i$value)
                ffp <- frostFreePeriod(i)
                
                # clean the aligned series with the user-supplied filter function
                a.y <- FUN(a.y, ...)
                
                # accumulate only days at or below frostTemp
                a.y[a.y > frostTemp] <- NA
                a.y <- -(a.y - frostTemp)
                
                # require that a frost-free period can be estimated for the year,
                # as a crude completeness check on the record
                if(!is.null(ffp))
                  return(sum(a.y, na.rm = TRUE))
                return(NA)
            }))
  
  # p-th quantile of the annual FI values
  q <- quantile(d.y[, 1], probs = p, na.rm = TRUE)[1]
  return(q)
}

DesignFreezingIndex <- function(d, ...) {
  frostTemp <- 32  # degrees Fahrenheit; CDEC daily air temperatures are reported in deg F
  p <- 0.9         # the "design" freezing index is the 90th percentile of annual FI
  return(FreezingIndex(d, frostTemp, p, ...))
}

DesignFreezingIndex(foo)
#  90% 
#  390
