Make torch forecasting model loadable from BufferedReader #2536

Open · MarcBresson opened this issue Sep 20, 2024 · 0 comments
Labels
feature request

Comments

@MarcBresson (Contributor)

Is your feature request related to a current problem? Please describe.

To load a torch forecaster, we currently need to provide a file path. However, when the model is stored in the cloud (in an S3 bucket, for instance), this means first copying the file to local storage and only then passing the resulting path to the loader.

Describe proposed solution

A solution would be to add two arguments to https://github.com/unit8co/darts/blob/master/darts/models/forecasting/torch_forecasting_model.py#L1708 that accept binary buffers (typically obtained with `open("filename.pt", mode="rb")`). The first would be a buffer for the base TorchForecastingModel file, and the second a buffer for the PyTorch Lightning LightningModule checkpoint.
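
A rough sketch of how this could look from the caller's side. The `pth_buffer` and `ckpt_buffer` argument names and the use of `RNNModel` are purely illustrative (not an existing darts API), and it assumes the Lightning checkpoint sits next to the model file under the usual `<path>.ckpt` name:

```python
import io

import boto3  # only needed for the S3 example below
from darts.models import RNNModel

# Local files: pass binary buffers instead of a path
# (hypothetical pth_buffer/ckpt_buffer arguments proposed in this issue)
with open("model.pt", mode="rb") as pth_buffer, open("model.pt.ckpt", mode="rb") as ckpt_buffer:
    model = RNNModel.load(pth_buffer=pth_buffer, ckpt_buffer=ckpt_buffer)

# Cloud storage: stream the objects into in-memory buffers, no local copy needed
s3 = boto3.client("s3")
pth_buffer, ckpt_buffer = io.BytesIO(), io.BytesIO()
s3.download_fileobj("my-bucket", "models/model.pt", pth_buffer)
s3.download_fileobj("my-bucket", "models/model.pt.ckpt", ckpt_buffer)
pth_buffer.seek(0)
ckpt_buffer.seek(0)
model = RNNModel.load(pth_buffer=pth_buffer, ckpt_buffer=ckpt_buffer)
```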

Describe potential alternatives

None

Additional context

None

@MarcBresson MarcBresson added the triage label on Sep 20, 2024
@madtoinou madtoinou added the feature request label and removed the triage label on Sep 23, 2024