If a user attempts to submit a simulation through the API with a large `config` value, the request to the `/repos/${repoOwner}/${repoName}/dispatches` endpoint fails with `422 Unprocessable Entity` and the message "client_payload is too large.". The payload limit doesn't seem to be documented, but others suggest it's around 70 kB.
I can see three options here:

1. We document some limit and reject requests bigger than it.
   - Probably not viable for the work we're currently trying to do.
2. We compress (some of) the `client_payload` before sending it to GitHub, then decompress it again in the model-runner.
   - Should be straightforward, although it technically doesn't remove the limit entirely.
3. We write (some of) the `client_payload` to an object in the Blob Store, send only the URL in the API request, then fetch the file in the model-runner.
   - The most future-proof solution, but it requires the web-ui to write to the blob store, where previously it only read.
A version using option 2 (compression) has now been released, which should prevent this error in most cases. The last option is probably still the best, however, and supports #44, so I'm going to leave this open until it is implemented.
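For reference, option 3 could be sketched like this, with the blob store abstracted behind injected `putBlob`/`getBlob` functions since its real API isn't shown in this issue; all names here are hypothetical:

```javascript
// web-ui side: write the full config to the blob store first, then keep
// client_payload small by dispatching only the resulting URL.
async function dispatchViaBlobStore(config, putBlob, dispatch) {
  const url = await putBlob(JSON.stringify(config));
  await dispatch({
    event_type: "run-simulation",
    client_payload: { configUrl: url }, // tiny regardless of config size
  });
}

// model-runner side: resolve the URL back into the config object.
async function loadConfig(clientPayload, getBlob) {
  return JSON.parse(await getBlob(clientPayload.configUrl));
}
```

This removes the payload ceiling entirely, at the cost of giving the web-ui write access to the blob store.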