Sort of... In HSDS we use what is basically the hdf5-json schema for metadata, but chunk data is stored as blobs. See https://github.com/HDFGroup/hsds/blob/master/docs/design/obj_store_schema/obj_store_schema_v2.md for a description. This works pretty well - we've used it for "files" as large as 50 TB. "Files" is in quotes since what you get at the end is a large collection of files in a tree structure.
This was done to support the HDF service, but the same approach could be used outside the server.
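For anyone curious what the metadata/chunk split looks like in practice, here's a minimal sketch. The directory layout, file names, and metadata fields are hypothetical illustrations, not the actual schema from the design doc linked above: dataset metadata goes into a small JSON file, and each chunk is written as its own binary blob.

```python
import json
import os

import numpy as np

def write_dataset(root, name, data, chunk_shape):
    # Hypothetical layout: <root>/<name>/dataset.json holds metadata,
    # <root>/<name>/chunk_<i>.bin holds each chunk's raw bytes.
    meta = {
        "shape": list(data.shape),
        "dtype": str(data.dtype),
        "chunk_shape": list(chunk_shape),
    }
    os.makedirs(os.path.join(root, name), exist_ok=True)
    with open(os.path.join(root, name, "dataset.json"), "w") as f:
        json.dump(meta, f)

    # Write each chunk as a separate blob; only one chunk is ever
    # held in memory at a time (1-D case shown for brevity).
    (rows,) = data.shape
    (step,) = chunk_shape
    for i in range(0, rows, step):
        chunk = data[i : i + step]
        chunk.tofile(os.path.join(root, name, f"chunk_{i // step}.bin"))

write_dataset("store", "temperature", np.arange(1000, dtype="f4"), (256,))
```

The end result is exactly the tree-of-files shape described above: one small JSON object per dataset plus a flat set of chunk blobs, which maps naturally onto an object store.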
h5tojson.py and jsontoh5.py can't convert files whose size is comparable to the amount of physical memory on the machine the converter is running on.
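One way to keep memory bounded during conversion would be to stream the data chunk-by-chunk rather than materializing the whole file. A minimal sketch with h5py, assuming the source dataset is chunked (the file and dataset names are placeholders):

```python
import h5py

# Dataset.iter_chunks() yields one slice tuple per stored chunk, so
# peak memory stays near the size of a single chunk instead of the
# whole dataset.
total = 0.0
with h5py.File("big.h5", "r") as f:
    ds = f["data"]
    for sel in ds.iter_chunks():
        chunk = ds[sel]        # reads only this chunk into memory
        total += chunk.sum()   # stand-in for serializing the chunk out
print(total)
```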