predictions should not all be loaded into memory #2

Open
andregraubner opened this issue Dec 2, 2020 · 1 comment

@andregraubner (Owner)

When predicting, we currently load all predictions into memory at once. This will quickly run out of memory (OOM) on smaller machines.
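One way to avoid accumulating everything in memory is to yield predictions batch by batch and let the caller write each batch out incrementally. This is a minimal sketch, not the project's actual API; the `model` callable, `predict_in_batches` name, and batch size are all illustrative assumptions.

```python
import numpy as np

def predict_in_batches(model, inputs, batch_size=32):
    """Yield predictions one batch at a time instead of materialising
    the full prediction array in memory.

    `model` is any callable mapping a batch of inputs to a batch of
    predictions (hypothetical here, standing in for the real model).
    """
    for start in range(0, len(inputs), batch_size):
        batch = inputs[start:start + batch_size]
        yield model(batch)

# Toy stand-in model: doubles its input.
model = lambda x: x * 2
inputs = np.arange(10, dtype=np.float64)

# The caller consumes the generator incrementally; in practice each
# batch would be appended to a file on disk rather than kept in a list.
results = [batch for batch in predict_in_batches(model, inputs, batch_size=4)]
```

Only one batch of predictions is ever resident at a time, so peak memory scales with `batch_size` rather than with the size of the dataset.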

@andregraubner (Owner, Author)

While we're at it, it would be great to give the predictions more structure (at least when using a high-level abstraction for predicting): e.g., follow the file-structure convention of the input and output an xarray, which carries information about lat/lon etc.
Since the file standard will most likely be a two-group .nc file (see #4), where each group can be loaded into an xarray, it seems reasonable to make the output match the xarray you would get from reading the label group of an input.
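The idea above could look roughly like this: build the prediction `Dataset` on the coordinates of the input's label group, so predictions are a drop-in replacement for labels downstream. This is a sketch under assumptions — the label group is constructed in memory here, whereas in practice it would come from something like `xr.open_dataset(path, group="labels")`; the group name, variable name, and grid are hypothetical.

```python
import numpy as np
import xarray as xr

# Hypothetical label group of an input file, built in memory for
# illustration (in practice: opened from the .nc file's label group).
labels = xr.Dataset(
    {"label": (("lat", "lon"), np.zeros((2, 3)))},
    coords={"lat": [10.0, 20.0], "lon": [0.0, 1.0, 2.0]},
)

# Wrap the raw prediction array in a Dataset that reuses the label
# group's dims and coords, so lat/lon metadata is preserved.
raw_predictions = np.ones((2, 3))
predictions = xr.Dataset(
    {"label": (("lat", "lon"), raw_predictions)},
    coords=labels.coords,
)
```

Because `predictions` shares the label group's coordinates, any code that consumes the label xarray (plotting, scoring, regridding) can consume the prediction xarray unchanged.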
