
Add a means to optionally defer function call completion #970

Merged
markbackman merged 5 commits into main from mb/user-controlled-run-llm on Jan 13, 2025

Conversation

markbackman (Contributor)

This was motivated by the following Flows issue: pipecat-ai/pipecat-flows#67

The issue is that two consecutive completions occur in Flows: one when the function call completes and one for the subsequent node's message. This change adds the flexibility to defer the completion for a function call. For the Flows case, this allows a single completion to occur after both the function call result and the node's message have been added to the context.

This maintains backwards compatibility and attempts to avoid added complexity. I'm open to suggestions for different ways to accomplish this. In my testing, this works for Flows without breaking existing behavior.
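
For illustration, here is a minimal sketch of how a function handler could defer the completion. The `FunctionCallResultProperties` dataclass and `properties` keyword follow pipecat's public API for this feature; treat the exact spellings as assumptions rather than the precise code merged in this PR:

```python
from pipecat.frames.frames import FunctionCallResultProperties

# Hypothetical weather handler for a pipecat pipeline. The handler name and
# result payload are made up for illustration.
async def fetch_weather(function_name, tool_call_id, args, llm, context, result_callback):
    result = {"conditions": "sunny", "temperature_f": 72}
    # run_llm=False defers the completion: no inference is triggered when
    # this result lands in the context, so Flows can append the next node's
    # message and then trigger a single completion itself.
    await result_callback(result, properties=FunctionCallResultProperties(run_llm=False))
```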

markbackman changed the title from "Added an override_run_llm option to optionally defer function call completion" to "Add an override_run_llm option to optionally defer function call completion" on Jan 11, 2025
markbackman force-pushed the mb/user-controlled-run-llm branch from 2828f7c to 1ca6ecc on Jan 13, 2025
aconchillo (Contributor)

LGTM!

markbackman changed the title from "Add an override_run_llm option to optionally defer function call completion" to "Add a means to optionally defer function call completion" on Jan 13, 2025
markbackman merged commit 98e80b7 into main on Jan 13, 2025
4 checks passed
markbackman deleted the mb/user-controlled-run-llm branch on Jan 13, 2025
hou-man commented Jan 22, 2025

Hey @markbackman, thanks for pushing this improvement. Quick feedback: it seems to break function calling for OpenAI realtime pipelines:

```
self._function_call_result = None
run_llm
2025-01-21 17:01:44.375 | ERROR    | pipecat.services.openai_realtime_beta.context:_push_aggregation:223 - Error processing frame: 'FunctionCallResultFrame' object has no attribute 'run_llm'
```
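
The traceback suggests the realtime beta context aggregator reads a `run_llm` attribute that the frame it holds doesn't define. A minimal defensive sketch of a guard inside `_push_aggregation` (illustrative only, not necessarily the fix tracked in the follow-up issue):

```python
# Hypothetical guard: fall back to the pre-#970 behavior (run the LLM)
# when the held frame predates the new run_llm attribute.
run_llm = getattr(self._function_call_result, "run_llm", True)
```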

markbackman (Contributor, Author)

@hou-man thanks, you're right. I created #1063. We'll fix this in the next release.
