Offline downloads record limit #413
@nickdos will be able to add more details if needed, but my understanding is that the limit is there for two reasons:
@javier-molina, I was just observing the difference from the old system. If the 500,000-record download limit is intended, that is perfectly fine. It might be good to issue a warning and not even start the download if the query yields more than 500,000 records (again, not a show-stopper). If I am going to need a 500,000+ record download, it will be for a very specific purpose (all plant records from the VBA) and will not happen more than once a year, so I can make special arrangements. In future, a power-user role or API keys might be a good idea.
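A pre-flight count check along those lines should be cheap, since the record count can be fetched without starting a download. Below is a minimal sketch, assuming the biocache-service JSON endpoint `/ws/occurrences/search` accepts `pageSize=0` and returns a `totalRecords` field, and assuming the `biocache-ws.ala.org.au` host; the 500,000 figure is the limit under discussion, not a confirmed config value:

```python
import json
import urllib.parse
import urllib.request

DOWNLOAD_LIMIT = 500_000  # the limit under discussion; actual config may differ

def count_records(base_url: str, query: str, filters: list[str]) -> int:
    """Ask the biocache web service how many records match, without fetching any rows."""
    params = [
        ("q", query),
        ("pageSize", "0"),
        # Mirror the flag used in this issue's queries so counts are comparable.
        ("disableAllQualityFilters", "true"),
    ] + [("fq", f) for f in filters]
    url = f"{base_url}/occurrences/search?{urllib.parse.urlencode(params)}"
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)["totalRecords"]

def check_before_download(base_url: str, query: str, filters: list[str]) -> None:
    """Warn up front instead of silently truncating the result at the limit."""
    total = count_records(base_url, query, filters)
    if total > DOWNLOAD_LIMIT:
        raise SystemExit(
            f"Query matches {total:,} records, above the {DOWNLOAD_LIMIT:,} "
            "download limit; refine the query or arrange a special export."
        )
    print(f"{total:,} records, within the limit; proceeding with the download.")

# Example with the data resource from this issue (hypothetical ws base URL):
check_before_download(
    "https://biocache-ws.ala.org.au/ws",
    "*:*",
    ["data_resource_uid:dr376"],
)
```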
I thought the limit was going to be higher than 500,000. @peggynewman is best placed to advise on this. My understanding is that we want users to be able to download the single largest dataset but not the "whole ALA". So the limit needs to be something like the eBird or BirdLife number of records.
I did a download for the following query: https://biocache-test.ala.org.au/occurrences/search?&q=*&fq=data_resource_uid%3Adr376&disableAllQualityFilters=true. It contained only 500,000 rows.
The same download in the Biocache, https://biocache.ala.org.au/occurrences/search?&q=*&fq=data_resource_uid%3Adr376&disableAllQualityFilters=true, gives me all 994,654 records.
The Biocache Store has slightly more records than LA Pipelines, but not that many more.
I have never done such a big download before, but I can see myself doing bigger downloads in the future. Is the lower record limit on purpose?
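The size of the gap reported above can be confirmed without running the downloads, by issuing the same count-only query against both deployments. A sketch, reusing `count_records()` from earlier in the thread (the `/ws` base URL for the test instance is a guess):

```python
# Compare matching record counts on the test (pipelines) and production
# (biocache-store) instances for the same dr376 query.
for label, base in [
    ("biocache-test (pipelines)", "https://biocache-test.ala.org.au/ws"),
    ("biocache (store)", "https://biocache-ws.ala.org.au/ws"),
]:
    total = count_records(base, "*:*", ["data_resource_uid:dr376"])
    print(f"{label}: {total:,} records")
```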