Handling datasets across HoloViz #340
xref holoviz/lumen#369 (thanks @hoxbro!)
I agree it is a mess, but I think all of the approaches you listed have validity. The approach I like least is making intake an extra dependency (unless intake is required for some other reason, I'd prefer not to add a dependency just to fetch sample data!). One idea I've just had: mirror the Bokeh sample data, the samples shipped with the package, and everything else into an S3 bucket, then serve a static page from S3 describing all the datasets with their S3 URLs; we could point to that one page every time we need sample data. The biggest downside is that this would require an internet connection for all examples...
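A minimal sketch of what reading from such a bucket could look like. The bucket name and file layout here are placeholders, not an agreed location:

```python
# Hypothetical bucket name and layout -- placeholders, not an agreed location.
BUCKET_URL = "https://holoviz-data.s3.amazonaws.com"

def dataset_url(name: str) -> str:
    """Build the URL of a named CSV dataset in the shared bucket."""
    return f"{BUCKET_URL}/{name}.csv"

def load_dataset(name: str):
    """Fetch a dataset straight from the bucket (needs an internet connection)."""
    import pandas as pd  # deferred import: the URL helper works without pandas
    return pd.read_csv(dataset_url(name))
```

The static index page would then just list the names that `dataset_url` accepts.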
For the gridded data examples in hvPlot we also pull from xarray's sample data, which was an issue last week for a user who had to install various dependent packages before being able to run the examples, so it would be nice to clean this up. Maybe the best approach would be our own S3 bucket to fetch from, plus a fallback for those without live internet access at the time they run the examples (some way to pre-fetch the datasets locally, so that we look for them there before checking S3).
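The "look locally first, then S3" fallback could be sketched like this. The environment variable, cache directory, and bucket URL are all hypothetical names for illustration:

```python
import os
import urllib.request
from pathlib import Path

# Placeholder locations -- both the env var and the bucket name are hypothetical.
CACHE_DIR = Path(os.environ.get("HOLOVIZ_DATA_DIR", str(Path.home() / ".holoviz_data")))
BUCKET_URL = "https://holoviz-data.s3.amazonaws.com"

def fetch(name: str, cache_dir: Path = CACHE_DIR) -> Path:
    """Return a local path for `name`; download from S3 only on a cache miss.

    Pre-fetched files in `cache_dir` short-circuit the download, so examples
    still run without live internet access.
    """
    local = cache_dir / name
    if not local.exists():
        cache_dir.mkdir(parents=True, exist_ok=True)
        urllib.request.urlretrieve(f"{BUCKET_URL}/{name}", local)
    return local
```

A pre-fetch step (e.g. on a machine with internet access) would simply call `fetch` once per dataset to populate the cache directory.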
superseded by #394
The datasets used across HoloViz come from various places:

- `sampledata`
- `pd.read_csv('https://...')`

Having so many sources isn't ideal. I suggest we define a list of datasets we want to use and define a way to manage them (where are they hosted? how do we fetch them?).

This is partially related to how the `/examples` folder should be handled (holoviz-dev/nbsite#243).
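One way to make "a list of datasets and a way to manage them" concrete is a small registry mapping names to canonical sources. This is only a sketch; every name and URL below is a placeholder, not a decision about where HoloViz data should actually live:

```python
# Hypothetical registry -- every name and URL below is a placeholder.
DATASETS = {
    "penguins": "https://example-bucket.s3.amazonaws.com/penguins.csv",
    "air_temperature": "https://example-bucket.s3.amazonaws.com/air_temperature.nc",
}

def source_for(name: str) -> str:
    """Look up the canonical URL for a dataset, failing loudly on unknown names."""
    try:
        return DATASETS[name]
    except KeyError:
        raise KeyError(
            f"Unknown dataset {name!r}; known datasets: {sorted(DATASETS)}"
        ) from None
```

A single registry like this would let docs, examples, and any fetching helper agree on one source of truth per dataset.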