Releases: pipecat-ai/pipecat-flows
v0.0.14
v0.0.13
Added
- Added context update strategies to control how context is managed during node
  transitions:
  - APPEND: Add new messages to existing context (default behavior)
  - RESET: Clear and replace context with new messages and most recent
    function call results
  - RESET_WITH_SUMMARY: Reset context but include an LLM-generated summary
    along with the new messages
- Strategies can be set globally or per-node
- Includes automatic fallback to RESET if summary generation fails
Example usage:
# Global strategy
flow_manager = FlowManager(
    context_strategy=ContextStrategyConfig(
        strategy=ContextStrategy.RESET
    )
)

# Per-node strategy
node_config = {
    "task_messages": [...],
    "functions": [...],
    "context_strategy": ContextStrategyConfig(
        strategy=ContextStrategy.RESET_WITH_SUMMARY,
        summary_prompt="Summarize the key points discussed so far."
    )
}
- Added a new function called get_current_context, which provides access to
  the LLM context.
Example usage:
# Access current conversation context
context = flow_manager.get_current_context()
- Added a new dynamic example called restaurant_reservation.py.
Changed
- Transition callbacks now receive function results directly as a second
  argument: async def handle_transition(args: Dict, result: FlowResult,
  flow_manager: FlowManager). This enables direct access to typed function
  results for making routing decisions. For backwards compatibility, the
  two-argument signature (args: Dict, flow_manager: FlowManager) is still
  supported. (See the sketch after this list.)
- Updated dynamic examples to use the new result argument.
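As an illustration, here is a minimal sketch of a transition callback that uses
the typed result to choose the next node. The node names, the age field, and the
node factory functions are hypothetical; set_node is used as in the dynamic flow
examples.

from pipecat_flows import FlowManager, FlowResult, NodeConfig

def create_adult_survey_node() -> NodeConfig:
    # Hypothetical node definition, abbreviated
    return {"task_messages": [...], "functions": []}

def create_guardian_consent_node() -> NodeConfig:
    # Hypothetical node definition, abbreviated
    return {"task_messages": [...], "functions": []}

async def handle_age_collection(args: dict, result: FlowResult, flow_manager: FlowManager):
    # Route on the typed result instead of re-parsing the raw arguments
    if result.get("age", 0) >= 18:
        await flow_manager.set_node("adult_survey", create_adult_survey_node())
    else:
        await flow_manager.set_node("guardian_consent", create_guardian_consent_node())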
Deprecated
- The tts parameter in FlowManager.__init__() is now deprecated and will be
  removed in a future version. The tts_say action now pushes a TTSSpeakFrame.
  (See the sketch below.)
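For example, a node can keep using the tts_say action without passing the
deprecated tts parameter. A rough sketch, with illustrative text:

node_config = {
    "task_messages": [...],
    "functions": [...],
    # tts_say now pushes a TTSSpeakFrame through the pipeline, so FlowManager
    # no longer needs a TTS service handed to it for this action to work.
    "pre_actions": [{"type": "tts_say", "text": "Let me check that for you."}],
}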
v0.0.12
Added
- Support for inline action handlers in flow configuration:
  - Actions can now be registered via a handler field in the config
  - Maintains backwards compatibility with manual registration
  - Built-in actions (tts_say, end_conversation) work without changes
Example of the new pattern:
"pre_actions": [
{
"type": "check_status",
"handler": check_status_handler
}
]
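For completeness, a rough sketch of what the inline handler itself might look
like, assuming action handlers receive the action's configuration dict (the
check_status_handler behavior here is hypothetical):

async def check_status_handler(action: dict) -> None:
    # Side effect to run before the node's messages are sent to the LLM,
    # e.g. looking up state in an external system.
    print(f"Running pre-action: {action['type']}")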
Changed
- Updated dynamic flows to use per-function, inline transition callbacks:
  - Removed global transition_callback from FlowManager initialization
  - Transition handlers are now specified directly in function definitions
  - Dynamic transitions are now specified similarly to the static flows'
    transition_to field
  - Breaking change: Dynamic flows must now specify transition callbacks in
    function configuration
Example of the new pattern:
# Before - global transition callback
flow_manager = FlowManager(
    transition_callback=handle_transition
)

# After - inline transition callbacks
def create_node() -> NodeConfig:
    return {
        "functions": [{
            "type": "function",
            "function": {
                "name": "collect_age",
                "handler": collect_age,
                "description": "Record user's age",
                "parameters": {...},
                "transition_callback": handle_age_collection
            }
        }]
    }
- Updated dynamic flow examples to use the new transition_callback pattern.
Fixed
- Fixed an issue where multiple, consecutive function calls could result in two completions.
v0.0.11
Changed
- Updated FlowManager to handle function calls more predictably:
  - Edge functions (which transition to a new node) now result in an LLM
    completion after both the function call and messages are added to the
    LLM's context.
  - Node functions (which execute a function call without transitioning nodes)
    result in an LLM completion upon the function call result returning.
  - This change also improves the reliability of the pre- and post-action
    execution timing.
- Breaking changes:
  - The FlowManager has a new required arg, context_aggregator (see the
    initialization sketch below).
  - Pipecat's minimum version has been updated to 0.0.53 in order to use the
    new FunctionCallResultProperties frame.
- Updated all examples to align with the new changes.
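A rough sketch of FlowManager initialization under these changes; the service
choice, model, and pipeline layout are illustrative rather than prescribed by
the release:

import os

from pipecat.pipeline.pipeline import Pipeline
from pipecat.pipeline.task import PipelineTask
from pipecat.processors.aggregators.openai_llm_context import OpenAILLMContext
from pipecat.services.openai import OpenAILLMService
from pipecat_flows import FlowManager

llm = OpenAILLMService(api_key=os.getenv("OPENAI_API_KEY"), model="gpt-4o")
context = OpenAILLMContext()
context_aggregator = llm.create_context_aggregator(context)

# A minimal pipeline; a real bot would also include transport and TTS/STT stages.
pipeline = Pipeline([context_aggregator.user(), llm, context_aggregator.assistant()])
task = PipelineTask(pipeline)

# context_aggregator is now required so FlowManager can manage the conversation
# context across node transitions. (Static flow_config or dynamic transition
# setup is omitted here for brevity.)
flow_manager = FlowManager(
    task=task,
    llm=llm,
    context_aggregator=context_aggregator,
)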
v0.0.10
Changed
- Nodes now have two message types to better delineate the role or persona of
  the bot from the task it needs to accomplish (see the sketch after this
  list). The message types are:
  - role_messages, which defines the personality or role of the bot
  - task_messages, which defines the task to be completed for a given node
- role_messages can be defined for the initial node and then inherited by
  subsequent nodes. You can treat this as an LLM "system" message.
- Simplified FlowManager initialization by removing the need for manual context
  setup in both static and dynamic flows. Now, you only need to create a
  FlowManager and initialize it to start the flow.
- All examples have been updated to align with the API changes.
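A rough sketch of a node using both message types; the wording of the messages
is illustrative:

initial_node = {
    "role_messages": [
        {
            "role": "system",
            "content": "You are a friendly booking assistant for a small hotel.",
        }
    ],
    "task_messages": [
        {
            "role": "system",
            "content": "Ask the caller for their check-in and check-out dates.",
        }
    ],
    "functions": [],
}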
Fixed
- Fixed an issue where importing the Flows module would require OpenAI,
Anthropic, and Google LLM modules.
v0.0.9
Changed
- Fixed function handler registration in FlowManager to handle __function__:
  tokens
  - Previously, the handler string was used directly, causing "not callable"
    errors
  - Now correctly looks up and uses the actual function object from the main
    module
  - Supports both direct function references and function names exported from
    the Flows editor (see the sketch below)
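To illustrate the two handler forms, assuming a collect_age handler defined in
the main module (the function name and the exact exported string are
hypothetical, inferred from the __function__: token format):

async def collect_age(args):
    # Hypothetical handler
    return {"status": "success", "age": args["age"]}

# Direct function reference
direct_config = {"name": "collect_age", "handler": collect_age}

# Handler given as a string, as exported by the Flows editor; FlowManager now
# resolves it to the actual function object instead of treating the string as
# callable.
exported_config = {"name": "collect_age", "handler": "__function__:collect_age"}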
v0.0.8
Changed
- Improved type safety in FlowManager by requiring keyword arguments for initialization
- Enhanced error messages for LLM service type validation
v0.0.7
Added
- New transition_to field for static flows
  - Combines function handlers with state transitions
  - Supports all LLM providers (OpenAI, Anthropic, Gemini)
  - Static examples updated to use this new transition
Changed
- Static flow transitions now use transition_to instead of matching function
  names (see the sketch below)
  - Before: Function name had to match target node name
  - After: Function explicitly declares target via transition_to
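A rough sketch of a static flow using the explicit target; the node names,
handler, and abbreviated message fields are hypothetical:

from pipecat_flows import FlowArgs, FlowResult

async def record_name(args: FlowArgs) -> FlowResult:
    # Hypothetical handler that just returns the collected name
    return {"status": "success", "name": args["name"]}

flow_config = {
    "initial_node": "collect_name",
    "nodes": {
        "collect_name": {
            "messages": [...],  # node messages, abbreviated
            "functions": [
                {
                    "type": "function",
                    "function": {
                        "name": "record_name",
                        "handler": record_name,
                        "description": "Record the user's name",
                        "parameters": {
                            "type": "object",
                            "properties": {"name": {"type": "string"}},
                        },
                        # Explicit target node, rather than the function name
                        # having to match a node name
                        "transition_to": "confirm_details",
                    },
                }
            ],
        },
        "confirm_details": {
            "messages": [...],
            "functions": [],
        },
    },
}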
Fixed
- Duplicate LLM responses during transitions
v0.0.6
Added
- New FlowManager supporting both static and dynamic conversation flows
- Provider-specific examples demonstrating dynamic flows:
  - OpenAI: insurance_openai.py
  - Anthropic: insurance_anthropic.py
  - Gemini: insurance_gemini.py
- Type safety improvements (see the sketch below):
  - FlowArgs: Type-safe function arguments
  - FlowResult: Type-safe function returns
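A minimal sketch of a type-safe handler using these; the AgeResult type and its
field are hypothetical:

from pipecat_flows import FlowArgs, FlowResult

class AgeResult(FlowResult):
    age: int

async def collect_age(args: FlowArgs) -> AgeResult:
    # args holds the arguments the LLM supplied for the function call
    return AgeResult(status="success", age=int(args["age"]))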
Changed
- Simplified function handling:
  - Automatic LLM function registration
  - Optional handlers for edge nodes
- Updated all examples to use unified FlowManager interface
v0.0.5
Added
- Added LLM support for:
  - Anthropic
  - Google Gemini
- Added LLMFormatParser, a format parser to handle LLM provider-specific
  messages and function call formats
- Added new examples:
  - movie_explorer_anthropic.py (Claude 3.5)
  - movie_explorer_gemini.py (Gemini 1.5 Flash)
  - travel_planner_gemini.py (Gemini 1.5 Flash)