
Releases: airtai/faststream

v0.4.4

24 Feb 09:02
77d8a66

What's Changed

Add Redis Stream batch size option:

from faststream.redis import StreamSub

@broker.subscriber(stream=StreamSub("input", batch=True, max_records=3))
async def on_input_data(msgs: list[str]):
    # batch=True delivers messages as a list; max_records caps the batch size
    assert len(msgs) <= 3

  • Update Release Notes for 0.4.3 by @faststream-release-notes-updater in #1247
  • docs: add manual run section by @Lancetnik in #1249
  • feat (#1252): respect Redis StreamSub last_id with consumer group by @Lancetnik in #1256 (see the sketch after this list)
  • fix: correct Redis consumer group behavior by @Lancetnik in #1258
  • feat: add Redis Stream max_records option by @Lancetnik in #1259
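
A minimal sketch of the consumer-group change, assuming StreamSub accepts group, consumer, and last_id parameters (the stream, group, and consumer names below are illustrative):

from faststream.redis import StreamSub

@broker.subscriber(
    stream=StreamSub("input", group="input-group", consumer="worker-1", last_id="0"),
)
async def on_group_data(msg: str):
    # With a consumer group, last_id controls where reading starts ("0" = from the beginning)
    ...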

Full Changelog: 0.4.3...0.4.4

v0.4.3

20 Feb 05:03
ffe1566

What's Changed

Allow specifying the Redis Stream maxlen option in the publisher:

@broker.publisher(stream=StreamSub("Output", maxlen=10))
async def on_input_data():
    ...
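
For context, a minimal self-contained sketch (the broker URL and the "input" channel name are illustrative):

from faststream import FastStream
from faststream.redis import RedisBroker, StreamSub

broker = RedisBroker("redis://localhost:6379")
app = FastStream(broker)

@broker.subscriber("input")
@broker.publisher(stream=StreamSub("Output", maxlen=10))  # trim the stream to roughly 10 entries
async def on_input_data(msg: str) -> str:
    return msg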

Full Changelog: 0.4.2...0.4.3

v0.4.2

02 Feb 06:07
f9f2c17

What's Changed

Bug fixes

Full Changelog: 0.4.1...0.4.2

v0.4.1

01 Feb 15:47
380740d

What's Changed

Bug fixes

Documentation

Full Changelog: 0.4.0...0.4.1

v0.4.0

30 Jan 14:22
a82cd95

What's Changed

This release adds support for Confluent's Python Client for Apache Kafka (TM). Confluent's Python Client for Apache Kafka does not natively support async functions, so its integration with modern async-based services is a bit trickier. That is why our initial Kafka broker support used aiokafka. However, that was a less fortunate choice, as aiokafka is not as well maintained as the Confluent client. After receiving numerous requests, we finally decided to bite the bullet, create an async wrapper around Confluent's Python Client, and add full support for it in FastStream.

If you want to try it out, install it first with:

pip install "faststream[confluent]>=0.4.0"

To connect to Kafka using the FastStream KafkaBroker module, follow these steps:

  1. Initialize the KafkaBroker instance: Start by initializing a KafkaBroker instance with the necessary configuration, including the Kafka broker address.

  2. Create your processing logic: Write a function that consumes incoming messages in the defined format and produces a response for the defined topic.

  3. Decorate your processing function: To connect your processing function to the desired Kafka topics, decorate it with the @broker.subscriber(...) and @broker.publisher(...) decorators. Once you start your application, the function will be called whenever a new message is available on the subscribed topic, and its return value will be published to the topic defined in the publisher decorator.

Here's a simplified code example demonstrating how to establish a connection to Kafka using FastStream's KafkaBroker module:

from faststream import FastStream
from faststream.confluent import KafkaBroker

broker = KafkaBroker("localhost:9092")
app = FastStream(broker)

@broker.subscriber("in-topic")
@broker.publisher("out-topic")
async def handle_msg(user: str, user_id: int) -> str:
    # Consumes from "in-topic" and publishes the returned string to "out-topic"
    return f"User: {user_id} - {user} registered"

For more information, please visit the documentation at:

https://faststream.airt.ai/latest/confluent/

List of Changes

New Contributors

Full Changelog: 0.3.13...0.4.0

v0.4.0rc0

12 Jan 07:49
5cac95b
Pre-release

What's Changed

This is a preview version of the 0.4.0 release, introducing support for a Confluent-based Kafka broker.

Here's a simplified code example demonstrating how to establish a connection to Kafka using FastStream's KafkaBroker module:

from faststream import FastStream
from faststream.confluent import KafkaBroker

broker = KafkaBroker("localhost:9092")
app = FastStream(broker)

@broker.subscriber("in-topic")
@broker.publisher("out-topic")
async def handle_msg(user: str, user_id: int) -> str:
    return f"User: {user_id} - {user} registered"

Changes

Full Changelog: 0.3.13...0.4.0rc0

v0.3.13

06 Jan 16:09
763c70b

What's Changed

New features

Bug fixes

  • Fix minor typos in documentation and code by @mj0nez in #1116

New Contributors

Full Changelog: 0.3.12...0.3.13

v0.3.12

04 Jan 11:56
accbd78

What's Changed

Bug fixes

Misc

Full Changelog: 0.3.11...0.3.12

v0.3.11

27 Dec 18:54
a5caa68

What's Changed

NATS concurrent subscriber:

By default, the NATS subscriber processes messages from a subject one at a time, so you can't handle multiple messages from the same subject simultaneously. With the broker.subscriber(..., max_workers=...) option, you can: it creates a pool of async tasks that consume multiple messages from the same subject and process them concurrently.

from faststream import FastStream
from faststream.nats import NatsBroker

broker = NatsBroker()
app = FastStream(broker)

@broker.subscriber("test-subject", max_workers=10)
async def handler(msg: str):
    """Can process up to 10 messages concurrently."""

  • Update Release Notes for 0.3.10 by @faststream-release-notes-updater in #1091
  • fix (#1100): FastAPI 0.106 compatibility by @Lancetnik in #1102

Full Changelog: 0.3.10...0.3.11

v0.3.10

23 Dec 19:09
8ab9d30

What's Changed

New features

Bug fixes

Documentation

Other

Full Changelog: 0.3.9...0.3.10