
Add a way to predict the time the /v1/player/batch endpoint takes. #6

Open
Drummersbrother opened this issue Jul 19, 2017 · 1 comment


@Drummersbrother

The /v1/player/batch endpoint has inconsistent response times. It sometimes takes more than 8 seconds to return when I query 3 Steam accounts that I've queried before; other times the same request takes less than 0.2 seconds.
I understand that this cannot be fixed completely (the data has to be updated at some point), but it would be very nice to have some way of predicting how long a given request to the endpoint will take.

Please add some way of predicting how long a batch request will take.
Suggestions for a solution:

  • A new endpoint that returns whether a user (unique ID) is already in the RLS database and doesn't need to be updated (and can therefore be returned quickly by the batch endpoint).
  • A parameter (probably in the POST JSON data) that chooses between two behaviours: one keeps the current behaviour (load data for all requested users, even if it takes time); the other returns data only for the users that don't need an update, so the response comes back quickly.
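The second suggestion could work roughly like this — a minimal sketch of the proposed server-side behaviour, not the real RLS implementation. The `cached_only` flag name and the cache structure are illustrative assumptions:

```python
# Sketch of the proposed batch behaviour (hypothetical, not the real RLS API).
# `cache` maps unique IDs to player data that is already fresh in the database.

def batch_lookup(requested_ids, cache, cached_only=False):
    """Return player data for requested_ids.

    cached_only=False: current behaviour -- return data for every requested
    user, even when stale entries must be (slowly) refreshed upstream.
    cached_only=True: proposed behaviour -- return only the users already in
    the cache, so the response is always fast and its duration is predictable.
    """
    if cached_only:
        return {pid: cache[pid] for pid in requested_ids if pid in cache}
    # Simulated slow path: IDs missing from the cache get freshly fetched data.
    return {pid: cache.get(pid, {"steam_id": pid, "updated": True})
            for pid in requested_ids}

cache = {"7656119A": {"steam_id": "7656119A", "updated": False}}
fast = batch_lookup(["7656119A", "7656119B"], cache, cached_only=True)
full = batch_lookup(["7656119A", "7656119B"], cache)
```

With this shape, a client can first issue a fast `cached_only` request for an instant partial result, then follow up with a full request for the remaining IDs if it is willing to wait.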
@AeonLucid
Member

Second suggestion seems reasonable. I'd like to do that for the single player endpoint too.

I'll also look at improving the batch method so it updates faster.
