When creating a batch job, the user can specify a "budget":

> `POST /jobs`
>
> Maximum amount of costs the request is allowed to produce. The value MUST be specified in the currency of the back-end. No limits apply if the value is null or the back-end has no currency set in `GET /`.
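For illustration, such a request might look roughly like the sketch below. The back-end URL, token, and process graph are placeholders, and the exact body schema depends on the openEO API version in use:

```python
import requests

# Hypothetical back-end URL and token, for illustration only.
API_URL = "https://backend.example/v1"
HEADERS = {"Authorization": "Bearer <token>"}

body = {
    "process": {"process_graph": {}},  # placeholder process graph
    "budget": 100,  # maximum allowed costs, in the back-end's currency
}
response = requests.post(f"{API_URL}/jobs", json=body, headers=HEADERS)
response.raise_for_status()
```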
The need for such a feature came up last week, when jobs suddenly took significantly longer to run due to cluster saturation and consequently consumed significantly more credits than normal.
I'm not sure whether we can reliably predict the cost of a job before running it, so rejecting a job upfront because of budget constraints is probably not feasible at the moment.
However, we could look into automatically canceling a job once it goes over budget (assuming we can track costs in near real time).
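A minimal sketch of what such a watchdog could look like on the back-end side; the cost lookup and cancel operations are injected as callables because they are back-end specific (in openEO API terms, cancelling maps to `DELETE /jobs/{job_id}/results`):

```python
import time
from typing import Callable, Optional


def enforce_budget(
    job_id: str,
    budget: float,
    get_costs: Callable[[str], Optional[float]],
    cancel: Callable[[str], None],
    poll_interval: float = 60.0,
) -> None:
    """Poll a job's accrued costs and cancel it once the budget is exceeded.

    get_costs and cancel are hypothetical back-end helpers, passed in so
    this sketch stays back-end agnostic.
    """
    while True:
        costs = get_costs(job_id)
        if costs is not None and costs > budget:
            cancel(job_id)
            return
        time.sleep(poll_interval)
```

A real implementation would also need to stop polling when the job finishes normally, and account for back-ends that only update cost accounting with some delay.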
This `budget` attribute is already supported by the Python client (see https://open-eo.github.io/openeo-python-client/api.html#openeo.rest.datacube.DataCube.create_job), but the geopyspark back-end currently ignores it.
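For reference, setting the budget via the Python client looks roughly like this (the back-end URL and collection id are placeholders):

```python
import openeo

connection = openeo.connect("https://backend.example")  # placeholder URL
connection.authenticate_oidc()
cube = connection.load_collection("SENTINEL2_L2A")  # placeholder collection
job = cube.create_job(title="budget test", budget=100)  # budget per the linked docs
job.start_job()
```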