Can't get batch working with s3 sink #21799
Replies: 2 comments 3 replies
-
Hi @extremeandy, at first glance your config looks good. Setting How many events reach your
You are correct that
Correct, in the sense that batching determines how we bundle bytes into requests before sending telemetry downstream. However, it's worth keeping an eye on the buffer metrics to see if backpressure is building up, which might not be the case here.
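For reference, one way to watch those buffer metrics is to have Vector expose its own internal metrics and scrape them. This is a minimal sketch, not the poster's actual config; it assumes the `internal_metrics` source and `prometheus_exporter` sink available in recent Vector releases, and the source/sink names are hypothetical:

```yaml
# Hypothetical addition to vector.yaml: expose Vector's internal metrics
# (including buffer gauges such as buffer_events / buffer_byte_size)
# so backpressure can be observed from outside the process.
sources:
  internal:
    type: internal_metrics

sinks:
  metrics_out:
    type: prometheus_exporter
    inputs: ["internal"]
    address: "0.0.0.0:9598"   # Vector's default prometheus_exporter port
```

With this in place, scraping `localhost:9598/metrics` should show buffer-related gauges for each sink; a buffer that grows steadily rather than draining is a sign of backpressure.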
-
This may be related to #21696 🤔
-
I have the following configured in `vector.yaml`:

Everything seems to work except the batch config. I am seeing very small files (~3000 lines, ~50 kB).
Documentation seems to suggest that max_bytes is optional, so I didn't set that.
And from what I'm reading, the buffer size should be independent of the batch size, so I didn't configure anything there either.
Am I missing something?
Edit: Setting `max_bytes: 500000000` resolved this for me, though the docs claim it is optional.
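For context, here is a sketch of where that setting nests under an `aws_s3` sink. The bucket and input names are illustrative assumptions (the poster's actual config is not shown in the thread), but `batch.max_bytes`, `batch.max_events`, and `batch.timeout_secs` are the documented batch knobs for Vector sinks:

```yaml
sinks:
  s3_out:
    type: aws_s3
    inputs: ["my_source"]      # hypothetical input name
    bucket: "example-bucket"   # hypothetical bucket
    batch:
      max_bytes: 500000000     # the setting that resolved the issue above (~500 MB)
      timeout_secs: 300        # flush a partial batch after 5 minutes regardless
```

Note that when `max_bytes` is left unset, a sink-specific default applies; the small objects described above suggest that the effective default for this sink is far below 500 MB, which is why setting it explicitly changed the behavior.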