[exporter/prometheusremotewrite] Make maxBatchByteSize configurable #21911
Pinging code owners:
See Adding Labels via Comments if you do not have permissions to add labels yourself.
@yotamloe could you please submit a PR to support it? :-P
@JaredTan95 Sure, I will open a PR for the feature. I hope to get to it this week or early next week.
Created a PR for this issue: #23447
This issue has been inactive for 60 days. It will be closed in 60 days if there is no activity. To ping code owners by adding a component label, see Adding Labels via Comments, or if you are unsure which component this issue relates to, please ping the code owners.
Pinging code owners:
See Adding Labels via Comments if you do not have permissions to add labels yourself.
…23447) Adding a feature: making `maxBatchByteSize` a configurable parameter. This allows users to adjust it based on the capabilities of their specific remote storage, offering more flexibility and potentially improving performance. Example:

```yaml
exporters:
  prometheusremotewrite:
    endpoint: "https://my-cortex:7900/api/v1/push"
    max_batch_byte_size: 5000000
```

Fixes #21911

**Testing:** Added `MaxBatchByteSize` to `TestLoadConfig(t *testing.T)` in `config_test.go`.

**Documentation:** Added to `README.md`:

- `max_batch_byte_size` (default = `3000000`, ~2.861 MiB): Maximum size of a batch of samples to be sent to the remote write endpoint. If the batch size is larger than this value, it will be split into multiple batches.

Co-authored-by: Alex Boten <[email protected]>
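As a usage sketch, the option can be tuned toward a backend's actual per-request limit. The values below are illustrative only: they assume the roughly 9 MB ingestion limit mentioned later in this issue, with some headroom, and a placeholder endpoint; they are not taken from the merged PR.

```yaml
exporters:
  prometheusremotewrite:
    endpoint: "https://my-remote-storage:9090/api/v1/push"  # placeholder endpoint
    # Assumption: the backend accepts ~9 MB per POST request,
    # so the batch ceiling is raised while staying below that limit.
    max_batch_byte_size: 8000000
```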
Component(s)
exporter/prometheusremotewrite
Is your feature request related to a problem? Please describe.
The `maxBatchByteSize` in the `prometheusremotewrite` exporter is currently set to a static value of ~2.861 MiB (as seen here).
Example use case
My remote storage ingestion endpoint can handle around 9 MB per POST request. I'm thinking that if we increase the `maxBatchByteSize` value, we could send metrics from the queue faster, which could lead to better performance when scaling up.
Describe the solution you'd like
I'd like to suggest making `maxBatchByteSize` a configurable parameter. This would allow users to adjust it based on the capabilities of their specific remote storage, offering more flexibility and potentially improving performance. Something like:
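(The snippet below reconstructs the inline example referenced here, matching the one in the PR description above; the endpoint is a placeholder.)

```yaml
exporters:
  prometheusremotewrite:
    endpoint: "https://my-cortex:7900/api/v1/push"
    # Proposed option: maximum size, in bytes, of a single remote-write request
    # before the exporter splits the batch.
    max_batch_byte_size: 5000000
```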
Describe alternatives you've considered
N/A
Additional context
I would be happy to contribute to this feature and open a PR for it.
I'm also curious to know whether there are any inherent limitations around `maxBatchByteSize`; I'd love to understand why it's currently static. Any insights would help me understand the current design better.