This is currently not handled and is likely only an issue with very large input chunk sizes, but it would be good to handle it gracefully.
At the moment it would result in a failed upload.
The first option is simply to add a check and throw an error sooner.
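A minimal sketch of what that early check could look like, assuming the limit in question is the standard 5 GiB S3 part-size cap (the names here are hypothetical, not existing code):

```python
# Hypothetical early check: fail before any data is uploaded if a part would
# be rejected by S3, rather than partway through the multipart upload.
MAX_PART_SIZE = 5 * 1024**3  # 5 GiB, the S3 limit for a single part

def check_part_sizes(part_sizes):
    """Raise if any planned part exceeds the per-part size limit."""
    for i, size in enumerate(part_sizes):
        if size > MAX_PART_SIZE:
            raise ValueError(
                f"part {i} is {size} bytes, which exceeds the {MAX_PART_SIZE} "
                "byte part-size limit; use smaller input chunks"
            )
```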
Another option, building on #10, is for the buffer part to take some of the excess data, which would raise the effective limit to roughly 10 GiB.
Oversized parts could also be split up, as long as the total stays within the 10,000-part limit (see the sketch below).
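A rough sketch of the splitting idea, assuming the usual S3 multipart limits of 5 GiB per part and 10,000 parts per upload (helper names are hypothetical):

```python
# Break an oversized part into pieces no larger than the part-size limit,
# and check that the total part count still fits in the 10,000-part budget.
MAX_PART_SIZE = 5 * 1024**3
MAX_PART_COUNT = 10_000

def split_part(data: bytes, max_size: int = MAX_PART_SIZE) -> list[bytes]:
    """Split one oversized part into consecutive pieces of at most max_size."""
    return [data[i : i + max_size] for i in range(0, len(data), max_size)]

def planned_part_count(part_sizes: list[int]) -> int:
    """Number of parts after splitting; error if it exceeds the part-count limit."""
    total = sum(-(-size // MAX_PART_SIZE) for size in part_sizes)  # ceiling division
    if total > MAX_PART_COUNT:
        raise ValueError(f"{total} parts exceeds the {MAX_PART_COUNT}-part limit")
    return total
```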
Beyond that, the user would need to supply a part size estimate, and I don't think that is needed for writing COGs.
The main problem is that the sizes of the parts are unknown when building the graph, so large parts can't be handled appropriately in the graph or when partitioning the data. The way around this would be to use futures to construct the writing process on the fly, but at this stage I would rather avoid doing that.