From 701923b202f7f1cc52e818afcf43873c30071617 Mon Sep 17 00:00:00 2001
From: Jeffrey Douangpaseuth <11084623+Nephery@users.noreply.github.com>
Date: Mon, 3 Jun 2024 16:33:48 -0400
Subject: [PATCH] add docs producer batched msg handling

---
 .../README.adoc | 43 ++++++++++++++++++-
 1 file changed, 41 insertions(+), 2 deletions(-)

diff --git a/solace-spring-cloud-starters/solace-spring-cloud-stream-starter/README.adoc b/solace-spring-cloud-starters/solace-spring-cloud-stream-starter/README.adoc
index 6d024391..6915dade 100644
--- a/solace-spring-cloud-starters/solace-spring-cloud-stream-starter/README.adoc
+++ b/solace-spring-cloud-starters/solace-spring-cloud-stream-starter/README.adoc
@@ -383,7 +383,8 @@ When set to `true`, messages will be delivered using local transactions.
 +
 Default: `false`
 +
-NOTE: The maximum transaction size is 256 messages.
+NOTE: The maximum transaction size is 256 messages. +
+The size of the transaction is 1 when the binding receives a regular Spring message. Otherwise, if it receives a <>, then the transaction size is equal to the batch size.
 
 provisionDurableQueue:: Whether to provision durable queues for non-anonymous consumer groups or queue destinations. This should only be set to `false` if you have externally pre-provisioned the required queue on the message broker.
@@ -830,7 +831,8 @@ Though note that there are few limitations:
 . `concurrency` > 1 is not supported with auto-provisioned topic endpoints.
 . Setting `provisionDurableQueue` to `false` disables endpoint configuration validation. Meaning that point 1 cannot be validated. In this scenario, it is the developer's responsibility to ensure that point 1 is followed.
 
-== Batch Consumers
+== Batched Messaging
+=== Batch Consumers
 
 https://docs.spring.io/spring-cloud-stream/docs/{scst-version}/reference/html/spring-cloud-stream.html#_batch_consumers[Batch consumers] can be enabled by setting `spring.cloud.stream.bindings..consumer.batch-mode` to `true`.
 In which case, batched messages may be consumed as follows:
@@ -872,6 +874,43 @@ See <> for more info regarding this binder's natively supp
 To create a batch of messages, the binder will consume messages from the PubSub+ broker until either a maximum batch size or timeout has been achieved. After which, the binder will compose the batch message and send it to the consumer handler for processing. Both these batching parameters can be configured using the `batchMaxSize` and `batchTimeout` consumer config options.
 
+=== Batch Producers
+
+Similar to batch consumers, batched messages may also be published through the producer binding:
+
+[source,java]
+----
+@Bean
+Supplier<Message<List<Payload>>> output() {
+    return () -> {
+        List<Payload> batchedPayloads = new ArrayList<>();
+        List<Map<String, Object>> batchedHeaders = new ArrayList<>();
+
+        for (int i = 0; i < 100; i++) {
+            // Create batched message contents
+            batchedPayloads.add(new Payload(i));
+            batchedHeaders.add(Map.of("my-header", "my-header-value"));
+        }
+
+        // Construct the batched message
+        return MessageBuilder.withPayload(batchedPayloads)
+                .setHeader(SolaceBinderHeaders.BATCHED_HEADERS, batchedHeaders)
+                .build();
+    };
+}
+----
+
+The producer binding looks for the `solace_scst_batchedHeaders` message header to determine whether the supplied Spring message is a batched Spring message or a regular Spring message.
+
+If the producer binding detects that it has received a batched Spring message, it will individually publish each item in the batch.
+
+[NOTE]
+====
+.Publishing Batched Messages using Transacted Producer Bindings
+
+When `transacted=true`, the size of the transaction is equal to the size of the batched Spring message.
+====
+
 == Partitioning
 
 [NOTE]
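
Editor's note: the patch above is documentation only, but the behavior it describes — a producer binding splitting a batched message into individual publishes, with the transaction size equal to the batch size when `transacted=true` — can be sketched in plain Java. `BatchedMessage` and `publishBatch` are hypothetical names for illustration only, not the binder's real internals:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;

// Hypothetical model of a batched Spring message: a payload list plus a
// parallel list of per-message headers (the role of solace_scst_batchedHeaders).
record BatchedMessage(List<Object> payloads, List<Map<String, Object>> headers) {}

public class BatchPublishSketch {

    // Sketch of the splitting step: each payload is "published" individually,
    // paired with its corresponding headers entry. The return value is the
    // number of messages published, i.e. the transaction size when transacted=true.
    static int publishBatch(BatchedMessage batch, List<String> publishedLog) {
        if (batch.payloads().size() != batch.headers().size()) {
            throw new IllegalArgumentException("payload and header lists must match in size");
        }
        for (int i = 0; i < batch.payloads().size(); i++) {
            publishedLog.add(batch.payloads().get(i) + " with headers " + batch.headers().get(i));
        }
        return batch.payloads().size();
    }

    public static void main(String[] args) {
        List<Object> payloads = new ArrayList<>();
        List<Map<String, Object>> headers = new ArrayList<>();
        for (int i = 0; i < 3; i++) {
            payloads.add("payload-" + i);
            headers.add(Map.of("my-header", "my-header-value"));
        }
        List<String> published = new ArrayList<>();
        int txSize = publishBatch(new BatchedMessage(payloads, headers), published);
        System.out.println("published " + txSize + " messages"); // prints "published 3 messages"
    }
}
```

Under this model, a regular (non-batched) Spring message behaves like a one-element batch, which matches the patch's note that the transaction size is 1 for regular messages.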