
cache not storing predictions #146

Open
rbawden opened this issue Dec 27, 2022 · 0 comments
rbawden commented Dec 27, 2022

The "--use_cache" argument only seems to be caching the model, not the predictions (contrary to what is indicated in the readme). Am I missing something here, or is this not currently implemented? I am running into the problem of reaching the time limit on a run, so all predictions are lost; if they could be cached, I would be able to resume the run where it left off.

Thank you for your help!
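For reference, the behaviour I had expected is something like the following minimal sketch of a disk-backed prediction cache that lets an interrupted run resume where it left off. All names here (`predict_with_cache`, `predictions_cache.json`, etc.) are hypothetical; this is not the project's actual `--use_cache` implementation.

```python
# Hypothetical sketch: cache predictions on disk so an interrupted run
# can be resumed without recomputing finished examples.
import json
import os

CACHE_PATH = "predictions_cache.json"  # hypothetical cache file

def load_cache(path=CACHE_PATH):
    """Return previously stored predictions, keyed by example id."""
    if os.path.exists(path):
        with open(path) as f:
            return json.load(f)
    return {}

def save_cache(cache, path=CACHE_PATH):
    """Persist the cache so nothing is lost if the run is killed."""
    with open(path, "w") as f:
        json.dump(cache, f)

def predict_with_cache(examples, predict_fn, path=CACHE_PATH):
    """Run predict_fn only on examples not already in the cache.

    examples: iterable of (example_id, example) pairs.
    """
    cache = load_cache(path)
    for ex_id, ex in examples:
        if ex_id in cache:
            continue  # already predicted in a previous (interrupted) run
        cache[ex_id] = predict_fn(ex)
        save_cache(cache, path)  # checkpoint after every prediction
    return cache
```

With this scheme, re-running the same command after hitting the time limit would skip every example whose prediction is already on disk and continue from the first missing one.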
