Releases: geospace-code/h5fortran
use generated templates to avoid code duplication
Since each rank needs its own subroutine (and `select rank` would not help much here), we generate the code from templates at configure time (via CMake or Meson). This improves code quality in general.
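As a hypothetical illustration (not h5fortran's actual source), the duplication that templating removes looks like this: each rank needs a nearly identical specific procedure behind one generic name, so generating the per-rank specifics from a template at configure time replaces hand-written copies:

```fortran
module write_demo
!! Hypothetical sketch: only ranks 1 and 2 are shown; the real library needs
!! specifics through 7-D, which is why the copies are generated.
implicit none
private
public :: write_any

interface write_any
  module procedure write_1d, write_2d
end interface write_any

contains

subroutine write_1d(x)
real, intent(in) :: x(:)
print '(a,i0)', 'writing rank-1 array of size ', size(x)
end subroutine write_1d

subroutine write_2d(x)
real, intent(in) :: x(:,:)
print '(a,i0)', 'writing rank-2 array of size ', size(x)
end subroutine write_2d

end module write_demo
```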
avoided the name %rank
Avoided the name `%rank` due to possible confusion with the intrinsic `rank()`. Noted that GCC 10.2.0 emits a spurious warning on the `intrinsic :: rank` declaration that is needed to work around an Intel compiler bug.
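A minimal sketch of the pattern in question: the explicit `intrinsic` declaration is the Intel workaround, and it is the line GCC 10.2.0 warns about:

```fortran
program rank_demo
!! the INTRINSIC statement below works around an Intel compiler bug,
!! but triggers a spurious warning from GCC 10.2.0
implicit none
intrinsic :: rank
real :: A(3, 4)

print '(a,i0)', 'rank(A) = ', rank(A)
end program rank_demo
```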
BUGFIX, features
- fix bug where `finalize()` closed ALL files instead of just the one file
- add `hdf5_close()` procedure, good to call at the end of a program to ensure no dangling file handles remain
- add `flush()` method, useful for long-running programs where periodic flushes preserve the data written so far if the program crashes (for example, by running out of its HPC time allotment)
- add `is_open` property, mostly used internally to guard against users calling methods on closed files, or double-initializing an open handle (see the usage sketch after this list)
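A minimal usage sketch of these lifecycle features, assuming the `status=`/`action=` keywords shown and that `hdf5_close()` is exported by the `h5fortran` module; the shipped signatures (e.g. an optional `ierr` argument) may differ:

```fortran
program lifecycle_demo
!! Sketch of the open / flush / close lifecycle; the keyword names and
!! module exports used here are assumptions, not the definitive API.
use h5fortran, only : hdf5_file, hdf5_close
implicit none

type(hdf5_file) :: h5
integer :: i

call h5%initialize('out.h5', status='new', action='rw')

do i = 1, 1000
  call h5%write('/iter', i)
  !! periodic flush: preserves data written so far if the program dies,
  !! e.g. by running out of its HPC time allotment
  if (modulo(i, 100) == 0) call h5%flush()
end do

!! is_open guards against double-closing; finalize() closes only this file
if (h5%is_open) call h5%finalize()

!! hdf5_close() at program end ensures no dangling files remain
call hdf5_close()
end program lifecycle_demo
```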
simplify pathlib generation
internal bugfix
status=scratch, more careful access handling
- `status='scratch'` works much like in standard Fortran, except you can specify an absolute directory; if you don't, a system temp directory is detected and used (see the sketch after this list)
- make the `access=` parameter clearer and more correct
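A hedged sketch of a scratch file, assuming the `status=`/`action=` keyword names shown; the directory handling follows the note above:

```fortran
program scratch_demo
!! Sketch only: keyword names and the cleanup-on-finalize semantics are
!! assumptions based on the release note above.
use h5fortran, only : hdf5_file
implicit none

type(hdf5_file) :: h5

!! no absolute directory given, so a detected system temp directory is used
call h5%initialize('work.h5', status='scratch', action='rw')

call h5%write('/tmpdata', [1, 2, 3])

call h5%finalize()
end program scratch_demo
```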
add %ndims, %rank methods
These methods are equivalent: both return the number of dimensions of a dataset.
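A sketch of querying a dataset's rank, assuming both methods take the dataset name and return an integer (note that a later release, listed above, avoids the `%rank` name):

```fortran
program ndims_demo
!! Sketch: the dataset-name argument and integer result are assumptions.
use h5fortran, only : hdf5_file
implicit none

type(hdf5_file) :: h5

call h5%initialize('data.h5', status='old', action='r')

print '(a,i0)', 'ndims of /A: ', h5%ndims('/A')
print '(a,i0)', 'rank  of /A: ', h5%rank('/A')   !! equivalent to %ndims

call h5%finalize()
end program ndims_demo
```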
more robust config step, add "is_hdf5" file
- added `is_hdf5(filename)` function that returns `.true.` if a file is HDF5 (see the sketch after this list)
- put stricter tests in the CMake and Meson configure steps to help users understand if there is a runtime HDF5 problem
- dedupe code by putting more into the internal check() function
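A brief sketch of the `is_hdf5()` check, assuming it is exported by the `h5fortran` module:

```fortran
program is_hdf5_demo
!! Sketch: the module export is an assumption; the function name and
!! .true.-if-HDF5 behavior are from the release note above.
use h5fortran, only : is_hdf5
implicit none

if (is_hdf5('data.h5')) then
  print '(a)', 'data.h5 is an HDF5 file'
else
  print '(a)', 'data.h5 is NOT an HDF5 file'
end if
end program is_hdf5_demo
```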
test robustness
test_deflate: use allocatable (heap) arrays instead of stack memory to avoid intermittent crashes of the test (illustrated below). This only affects the test, not the library.
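A hypothetical illustration of the pattern (not the actual test code): large automatic arrays inside a procedure are typically stack-allocated and can crash intermittently, whereas allocatable arrays use the heap:

```fortran
program alloc_demo
implicit none
call crunch(2000)

contains

subroutine crunch(n)
integer, intent(in) :: n
! real :: big(n, n)              !! automatic array: stack storage, crash-prone for large n
real, allocatable :: big(:, :)   !! allocatable array: heap storage, robust

allocate(big(n, n))
big = 0.
print '(a,i0)', 'elements: ', size(big)
end subroutine crunch

end program alloc_demo
```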
added `h5exist(filename, datasetname)` convenience function
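A short sketch, assuming `h5exist` is exported by the `h5fortran` module and returns a logical:

```fortran
program exist_demo
!! Sketch: the module export and logical result are assumptions;
!! the (filename, datasetname) argument order follows the note above.
use h5fortran, only : h5exist
implicit none

if (h5exist('data.h5', '/velocity')) then
  print '(a)', '/velocity exists in data.h5'
else
  print '(a)', '/velocity is missing from data.h5'
end if
end program exist_demo
```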
robustness improvements
- use native cmake features instead of custom logic
- `implicit none (type, external)` (see the sketch after this list)
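For reference, a minimal sketch of this Fortran 2018 statement (illustration only, not h5fortran source): besides requiring explicit typing, the `external` specifier also forbids referencing procedures that lack an explicit interface or EXTERNAL attribute:

```fortran
module strict_demo
!! Illustration only: the stricter implicit-none form catches both
!! untyped variables and implicitly-declared external procedures.
implicit none (type, external)

contains

pure real function square(x)
real, intent(in) :: x
square = x * x
end function square

end module strict_demo
```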
docs, cmake robustness
- docs: break up big readme into several linked files
- cmake: lint
add read slicing
Added the ability to slice arrays on read: you can index into huge HDF5 disk variables and read only part of an array, for 1-D through 7-D data (sketch below).
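A hedged sketch of a sliced read; the `istart=`/`iend=` keyword names and the bounds shown are assumptions for illustration:

```fortran
program slice_demo
!! Sketch: keyword names and open/close semantics are assumptions.
use h5fortran, only : hdf5_file
implicit none

type(hdf5_file) :: h5
real :: A(3, 4)   !! receives only the requested window, not the whole dataset

call h5%initialize('big.h5', status='old', action='r')

!! read rows 11..13 and columns 2..5 of a huge 2-D disk variable
call h5%read('/big2d', A, istart=[11, 2], iend=[13, 5])

call h5%finalize()
end program slice_demo
```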
Added HDF5 library build scripts, for those who need to build HDF5 themselves, under scripts/compile_hdf5.py