All notable changes to Pipecat Flows will be documented in this file.
The format is based on Keep a Changelog, and this project adheres to Semantic Versioning.
- Temporarily reverted the deprecation of the `tts` parameter in `FlowManager.__init__()`. This feature will be deprecated in a future release after the required Pipecat changes are completed.
- Added context update strategies to control how context is managed during node transitions:
  - `APPEND`: Add new messages to existing context (default behavior)
  - `RESET`: Clear and replace context with new messages and most recent function call results
  - `RESET_WITH_SUMMARY`: Reset context but include an LLM-generated summary along with the new messages
  - Strategies can be set globally or per-node
  - Includes automatic fallback to `RESET` if summary generation fails

  Example usage:

  ```python
  # Global strategy
  flow_manager = FlowManager(
      context_strategy=ContextStrategyConfig(
          strategy=ContextStrategy.RESET
      )
  )

  # Per-node strategy
  node_config = {
      "task_messages": [...],
      "functions": [...],
      "context_strategy": ContextStrategyConfig(
          strategy=ContextStrategy.RESET_WITH_SUMMARY,
          summary_prompt="Summarize the key points discussed so far."
      )
  }
  ```
- Added a new function called `get_current_context` which provides access to the LLM context.

  Example usage:

  ```python
  # Access current conversation context
  context = flow_manager.get_current_context()
  ```
- Added a new dynamic example called `restaurant_reservation.py`.
- Transition callbacks now receive the function result directly as a second argument: `async def handle_transition(args: Dict, result: FlowResult, flow_manager: FlowManager)`. This enables direct access to typed function results for making routing decisions. For backwards compatibility, the two-argument signature `(args: Dict, flow_manager: FlowManager)` is still supported.
- Updated dynamic examples to use the new result argument.
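  A transition callback using the three-argument signature might be sketched as follows. This is illustrative only: `StubFlowManager`, the node-builder helpers, and the `set_node` call are stand-ins for the library's `FlowManager` and `FlowResult` types, assumed here so the sketch is self-contained.

  ```python
  import asyncio
  from typing import Any, Dict

  class StubFlowManager:
      """Minimal stand-in for Pipecat Flows' FlowManager, just enough to show routing."""
      def __init__(self) -> None:
          self.next_node = None

      async def set_node(self, name: str, config: Dict[str, Any]) -> None:
          self.next_node = name

  def create_adult_node() -> Dict[str, Any]:
      return {"task_messages": [], "functions": []}

  def create_minor_node() -> Dict[str, Any]:
      return {"task_messages": [], "functions": []}

  async def handle_age_collection(args: Dict, result: Dict, flow_manager) -> None:
      # The second argument gives direct access to the function's typed result,
      # so routing decisions don't need to re-parse the raw arguments.
      if result.get("age", 0) >= 18:
          await flow_manager.set_node("adult", create_adult_node())
      else:
          await flow_manager.set_node("minor", create_minor_node())

  fm = StubFlowManager()
  asyncio.run(handle_age_collection({}, {"age": 21}, fm))
  print(fm.next_node)  # adult
  ```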
- The `tts` parameter in `FlowManager.__init__()` is now deprecated and will be removed in a future version. The `tts_say` action now pushes a `TTSSpeakFrame`.
- Support for inline action handlers in flow configuration:
  - Actions can now be registered via a `handler` field in the config
  - Maintains backwards compatibility with manual registration
  - Built-in actions (`tts_say`, `end_conversation`) work without changes

  Example of the new pattern:

  ```python
  "pre_actions": [
      {
          "type": "check_status",
          "handler": check_status_handler
      }
  ]
  ```
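  In miniature, an inline action handler and the dispatch the framework would perform could look like the following sketch. The single-dict handler signature and the dispatch loop are assumptions for illustration, not the documented Pipecat Flows API.

  ```python
  import asyncio
  from typing import Any, Dict

  executed = []

  # Sketch of a custom action handler; the single-dict-argument signature is an
  # assumption for illustration, not the documented Pipecat Flows API.
  async def check_status_handler(action: Dict[str, Any]) -> None:
      executed.append(action["type"])

  pre_actions = [
      {"type": "check_status", "handler": check_status_handler}
  ]

  # What an action runner would do with the inline handler, in miniature:
  for action in pre_actions:
      asyncio.run(action["handler"](action))

  print(executed)  # ['check_status']
  ```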
- Updated dynamic flows to use per-function, inline transition callbacks:
  - Removed the global `transition_callback` from `FlowManager` initialization
  - Transition handlers are now specified directly in function definitions
  - Dynamic transitions are now specified similarly to the static flows' `transition_to` field
  - Breaking change: Dynamic flows must now specify transition callbacks in function configuration

  Example of the new pattern:

  ```python
  # Before - global transition callback
  flow_manager = FlowManager(
      transition_callback=handle_transition
  )

  # After - inline transition callbacks
  def create_node() -> NodeConfig:
      return {
          "functions": [{
              "type": "function",
              "function": {
                  "name": "collect_age",
                  "handler": collect_age,
                  "description": "Record user's age",
                  "parameters": {...},
                  "transition_callback": handle_age_collection
              }
          }]
      }
  ```
- Updated dynamic flow examples to use the new `transition_callback` pattern.
- Fixed an issue where multiple consecutive function calls could result in two completions.
- Updated `FlowManager` to more predictably handle function calls:
  - Edge functions (which transition to a new node) now result in an LLM completion after both the function call and messages are added to the LLM's context.
  - Node functions (which execute a function call without transitioning nodes) result in an LLM completion upon the function call result returning.
  - This change also improves the reliability of pre- and post-action execution timing.
- Breaking changes:
  - The `FlowManager` has a new required argument, `context_aggregator`.
  - Pipecat's minimum version has been updated to 0.0.53 in order to use the new `FunctionCallResultProperties` frame.
- Updated all examples to align with the new changes.
- Nodes now have two message types to better delineate defining the role or persona of the bot from the task it needs to accomplish. The message types are:
  - `role_messages`, which defines the personality or role of the bot
  - `task_messages`, which defines the task to be completed for a given node
- `role_messages` can be defined for the initial node and then inherited by subsequent nodes. You can treat this as an LLM "system" message.
- Simplified `FlowManager` initialization by removing the need for manual context setup in both static and dynamic flows. Now, you only need to create a `FlowManager` and initialize it to start the flow.
- All examples have been updated to align with the API changes.
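  As a sketch, an initial node using both message types might look like the following. The field names follow this changelog; the message dict shape assumes the OpenAI-style role/content entries used in the examples.

  ```python
  # Sketch of a node config using both message types; "role_messages" and
  # "task_messages" follow the changelog, the message shape is assumed to be
  # OpenAI-style role/content dicts.
  initial_node = {
      "role_messages": [
          {
              "role": "system",
              "content": "You are a friendly travel agent. Keep replies brief.",
          }
      ],
      "task_messages": [
          {
              "role": "system",
              "content": "Ask the user for their destination city.",
          }
      ],
      "functions": [],
  }

  print(sorted(initial_node.keys()))  # ['functions', 'role_messages', 'task_messages']
  ```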
- Fixed an issue where importing the Flows module would require OpenAI, Anthropic, and Google LLM modules.
- Fixed function handler registration in `FlowManager` to handle `__function__:` tokens:
  - Previously, the handler string was used directly, causing "not callable" errors
  - Now correctly looks up and uses the actual function object from the main module
  - Supports both direct function references and function names exported from the Flows editor
- Improved type safety in `FlowManager` by requiring keyword arguments for initialization
- Enhanced error messages for LLM service type validation
- New `transition_to` field for static flows:
  - Combines function handlers with state transitions
  - Supports all LLM providers (OpenAI, Anthropic, Gemini)
  - Static examples updated to use this new transition
- Static flow transitions now use `transition_to` instead of matching function names:
  - Before: Function name had to match target node name
  - After: Function explicitly declares target via `transition_to`
- Fixed duplicate LLM responses during transitions
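  A static-flow function declaring its target with `transition_to` might be sketched as follows. The node and function names here are illustrative assumptions; the dict shape mirrors the function-definition examples elsewhere in this changelog.

  ```python
  # Sketch: a static-flow function that explicitly declares its target node via
  # "transition_to" (names like "record_name" and "confirmation" are illustrative).
  node = {
      "task_messages": [
          {"role": "system", "content": "Collect the user's name."}
      ],
      "functions": [
          {
              "type": "function",
              "function": {
                  "name": "record_name",
                  "description": "Record the user's name",
                  "parameters": {
                      "type": "object",
                      "properties": {"name": {"type": "string"}},
                  },
                  "transition_to": "confirmation",  # target node, not the function name
              },
          }
      ],
  }

  print(node["functions"][0]["function"]["transition_to"])  # confirmation
  ```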
- New `FlowManager` supporting both static and dynamic conversation flows
- Provider-specific examples demonstrating dynamic flows:
  - OpenAI: `insurance_openai.py`
  - Anthropic: `insurance_anthropic.py`
  - Gemini: `insurance_gemini.py`
- Type safety improvements:
  - `FlowArgs`: Type-safe function arguments
  - `FlowResult`: Type-safe function returns
- Simplified function handling:
  - Automatic LLM function registration
  - Optional handlers for edge nodes
- Updated all examples to use the unified `FlowManager` interface
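  A node-function handler written against these types might be sketched like this. Here `FlowArgs` and `FlowResult` are plain-dict stand-ins so the sketch runs without the library; in Pipecat Flows they are the library's own typed aliases.

  ```python
  import asyncio
  from typing import Any, Dict

  # Stand-ins for Pipecat Flows' FlowArgs / FlowResult so this sketch is
  # self-contained; the real library provides typed versions of these.
  FlowArgs = Dict[str, Any]
  FlowResult = Dict[str, Any]

  async def check_availability(args: FlowArgs) -> FlowResult:
      # A node function: does its work and returns a typed result for the LLM.
      party_size = args["party_size"]
      return {"status": "success", "available": party_size <= 8}

  result = asyncio.run(check_availability({"party_size": 4}))
  print(result)  # {'status': 'success', 'available': True}
  ```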
- Added LLM support for:
  - Anthropic
  - Google Gemini
- Added `LLMFormatParser`, a format parser to handle LLM provider-specific messages and function call formats
- Added new examples:
  - `movie_explorer_anthropic.py` (Claude 3.5)
  - `movie_explorer_gemini.py` (Gemini 1.5 Flash)
  - `travel_planner_gemini.py` (Gemini 1.5 Flash)
- New example `movie_explorer.py` demonstrating:
  - Real API integration with TMDB
  - Node functions for API calls
  - Edge functions for state transitions
  - Proper function registration pattern
- Renamed function types to use graph terminology:
  - "Terminal functions" are now "node functions" (operations within a state)
  - "Transitional functions" are now "edge functions" (transitions between states)
- Updated function registration process:
  - Node functions must be registered directly with the LLM before flow initialization
  - Edge functions are automatically registered by `FlowManager` during initialization
  - An LLM instance is now required in the `FlowManager` constructor
- Added flexibility to node naming with the Editor:
  - Start nodes can now use any descriptive name (e.g., "greeting")
  - End nodes conventionally use "end" but support custom names
  - Flow configuration's `initial_node` property determines the starting state
- All examples updated to use the new function registration pattern
- Documentation updated to reflect new terminology and patterns
- Editor updated to support flexible node naming
- Added an `examples` directory which contains five different examples showing how to build a conversation flow with Pipecat Flows.
- Added a new editor example called `patient_intake.json` which demonstrates a patient intake conversation flow.
- `pipecat-ai-flows` now includes `pipecat-ai` as a dependency, making it easier to get started with a fresh installation.
- Fixed an issue where terminal functions were updating the LLM context and tools. Now, only transitional functions update the LLM context and tools.
- Fixed an issue where `pipecat-ai` was mistakenly added as a dependency.
- Initial public beta release.
- Added a conversation flow management system through the `FlowState` and `FlowManager` classes. This system enables developers to create structured, multi-turn conversations using a node-based state machine. Each node can contain:
  - Multiple LLM context messages (system/user/assistant)
  - Available functions for that state
  - Pre- and post-actions for state transitions
  - Support for both terminal functions (stay in same node) and transitional functions
  - Built-in handlers for immediate TTS feedback and conversation end
- Added a `NodeConfig` dataclass for defining conversation states, supporting:
  - Multiple messages per node for complex prompt building
  - Function definitions for available actions
  - Optional pre- and post-action hooks
  - Clear separation between node configuration and state management