Perhaps we can create a simple shell script that users can run on e.g. a Netlify runner, which downloads and extracts a copy of the repo from the snapshot API. That way, users can control when the mirror is updated by triggering a build on Netlify.
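A minimal sketch of what such a build script could look like. The `/api/snapshot/zip` endpoint path is an assumption here (the exact path and parameters should be checked against the snapshot API docs), and the download itself is left commented out:

```shell
#!/bin/sh
set -eu

# Construct the snapshot download URL for a given universe.
# NOTE: "/api/snapshot/zip" is an assumed endpoint name; verify it
# against the r-universe snapshot API documentation.
snapshot_url() {
  echo "https://$1.r-universe.dev/api/snapshot/zip"
}

# Download and extract a full copy of the repo into the site output dir.
mirror_universe() {
  universe="$1"
  outdir="$2"
  mkdir -p "$outdir"
  curl -fsSL "$(snapshot_url "$universe")" -o snapshot.zip
  unzip -qo snapshot.zip -d "$outdir"
  rm -f snapshot.zip
}

# On Netlify, the build command could then be something like:
#   ./mirror.sh ropensci public
# so that every triggered build refreshes the mirror.
```

Since Netlify only rebuilds when triggered (via a build hook or a git push), this would give users full control over when the mirror updates.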
Also, for GitHub Pages it might be possible to use the GitHub deploy API and simply set the `artifact_url` to the snapshot URL. That would be really convenient if it works.
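Roughly, the experiment would be a call to the Pages deployment endpoint with the snapshot URL as the artifact. Whether `artifact_url` may point at an external (non-Actions) URL is exactly the open question; the payload fields and the OIDC token requirement below are assumptions to be checked against the GitHub REST API docs:

```shell
#!/bin/sh
set -eu

# Build the JSON payload for a Pages deployment pointing at an
# external artifact URL (hypothetical usage; field names assumed).
build_payload() {
  artifact_url="$1"
  build_version="$2"
  printf '{"artifact_url":"%s","pages_build_version":"%s","environment":"github-pages"}' \
    "$artifact_url" "$build_version"
}

# The actual API call would look something like (requires a token
# with Pages permissions; commented out in this sketch):
# curl -X POST \
#   -H "Authorization: Bearer $GITHUB_TOKEN" \
#   -H "Accept: application/vnd.github+json" \
#   "https://api.github.com/repos/OWNER/REPO/pages/deployments" \
#   -d "$(build_payload "https://ropensci.r-universe.dev/api/snapshot/zip" "$(date +%s)")"
```

If GitHub rejects external artifact URLs, the fallback is a scheduled Actions workflow that downloads the snapshot and uploads it as a regular Pages artifact.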
Let's experiment a bit with a few methods and see if we can easily set up a self-updating mirror for e.g. the rOpenSci universe, which is about 6 GB in total if we include all binaries.
We could do both. I thought the idea was that we would also try to aggregate information from the technotes into the docs, such that it becomes a searchable reference? Otherwise the information becomes even harder to find.
Elaborate on this: https://docs.r-universe.dev/install/reproducibility.html#using-snapshots
Instead of just linking to https://github.com/jeroen/backup, it would be nice to explain this in more detail.