LLM sends a message before state transition completes #67
Comments
I noticed this too. It seems to be a race condition. The problem is that
I just noticed that So to get this to work as intended (the context includes the function call and result and THEN the new system prompt), it would require adding a feature to
Thanks for flagging this, and apologies for the silence. Some of our team is still out on vacation, and I've been catching up on core Pipecat issues from the last few weeks. I've noticed this as well. Sometimes it's helpful for the LLM to generate a response from the function call results, but other times it's not. This should be a configurable parameter within the function call: that is, we should add a way to indicate whether a function call should generate a completion. This will be a core Pipecat change. I'm going to look into this issue to see what our options are. To be clear, this is not a bug; it's the desired behavior for many cases. But for Flows, the whole point is that you're in control of the conversation, so this is another behavior that should be under your control.
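To make the idea concrete, here is a minimal sketch of a per-function "should this call generate a completion" flag. All names here (`FunctionRegistry`, `run_llm`, etc.) are hypothetical and illustrative only; they are not the actual Pipecat API, just a model of the configurable parameter being discussed:

```python
# Hypothetical sketch: a per-function flag controlling whether the LLM
# should generate a completion from the function call result.
# None of these names are the real Pipecat API.
from dataclasses import dataclass
from typing import Any, Awaitable, Callable


@dataclass
class RegisteredFunction:
    handler: Callable[..., Awaitable[Any]]
    run_llm: bool = True  # generate a completion from the result?


class FunctionRegistry:
    def __init__(self) -> None:
        self._functions: dict[str, RegisteredFunction] = {}

    def register(self, name: str, handler, run_llm: bool = True) -> None:
        self._functions[name] = RegisteredFunction(handler, run_llm)

    def should_run_llm(self, name: str) -> bool:
        return self._functions[name].run_llm


async def noop(**kwargs):
    return {"status": "ok"}


registry = FunctionRegistry()
# For Flows-style transition functions, suppress the automatic completion
# so the new node's context is in place before the LLM speaks.
registry.register("start_interview", noop, run_llm=False)
registry.register("lookup_weather", noop, run_llm=True)

assert registry.should_run_llm("start_interview") is False
assert registry.should_run_llm("lookup_weather") is True
```

The design question the thread is weighing is exactly where this flag lives: at registration time (as sketched here), in the function call result, or as a runtime decision per call.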
Thanks Mark. I was going to open a sister issue in Pipecat core as a feature request, but I wasn't sure it would be welcome mostly just for the sake of pipecat-flows.
Here's the Pipecat change: pipecat-ai/pipecat#970. I have this working locally for dynamic flows. I still need to make sure everything works as expected for static flows. I'll post something when I have it.
Ha! I was hoping to submit my first Pipecat PR, but you beat me to it. I took a slightly different approach here: pipecat-ai/pipecat@main...captaincaius:pipecat:feat-bypass-run-llm. It doesn't allow a given registered function to sometimes run_llm and sometimes not, but since you mentioned in your PR that you're open to other options, I figured I'd share it in case you'd rather avoid a special key in the function call result, or prefer some other tradeoff.
Thanks for sharing. I think your approach could be a good one too. I'll talk it over with my team tomorrow to see what they prefer. Either way, I think we're on track to get Flows working very predictably after we get these changes in.
Sweet! I'm super stoked either way. Thanks again for all you folks' hard work!
Ok! I paired with Aleix and we improved the Pipecat core change to make it easier to build with. With that change, here is the Pipecat Flows change: #75. I've tested with the different LLM providers, for both static and dynamic flows, and the timing works well in all cases! Note that this is a breaking change, but a minor one: you only need to pass the context_aggregator to the FlowManager.
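For anyone updating, a sketch of what the breaking change looks like at the call site. This is a configuration fragment only; the exact constructor arguments are assumptions based on the pipecat-flows examples and may differ between releases:

```python
# Sketch only: assumes the pipecat-flows FlowManager accepts a
# context_aggregator argument after PR #75. Argument names may
# differ across releases; check the release notes for your version.
flow_manager = FlowManager(
    task=task,
    llm=llm,
    context_aggregator=context_aggregator,  # the newly required argument
)
await flow_manager.initialize()
```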
PR #75 has been merged. This will be in the next Flows release, which I'll cut at some point today or early tomorrow. |
Current Setup
Interview bot with dynamic flow handling:
initial node (greet user) -> call start_interview function (only a print statement here) -> call transition function (picks the question to be asked from flow-manager state) -> create and set node to question_node (specifies the question to be asked in the LLM context)
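The chain above can be modeled with a small sketch. All names here (`InterviewState`, `set_node`, the prompt strings) are illustrative only; the real setup uses pipecat-flows' dynamic flow API:

```python
# Hypothetical model of the dynamic-flow chain described above:
# greet -> start_interview (print only) -> transition -> question node.
import asyncio


class InterviewState:
    def __init__(self, questions):
        self.questions = list(questions)
        self.current_node = "greeting"
        self.context = ["system: greet the user"]

    async def start_interview(self):
        # Only a print statement in the real setup; no state change here.
        print("interview started")

    async def transition(self):
        # Pick the next question from flow-manager state and build the
        # question node's system prompt.
        question = self.questions.pop(0)
        await self.set_node(f"question: {question}")

    async def set_node(self, system_prompt):
        self.current_node = "question_node"
        self.context.append(f"system: {system_prompt}")


async def main():
    state = InterviewState(["Tell me about yourself."])
    await state.start_interview()
    await state.transition()
    return state


state = asyncio.run(main())
assert state.current_node == "question_node"
assert state.context[-1] == "system: question: Tell me about yourself."
```

The bug report below is about the gap between `transition` being called and the new system prompt actually landing in the context.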
Expected Functionality
Actual Behaviour
As per my understanding, the LLM sends a message before the new node's context is updated. Let me know if any code is needed from my end. I use the same format as in the dynamic flow example.
UPDATE: I added a 1-second sleep after setting the node and before the transition function returns; it works fine now.
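The sleep works because it gives the pending context update time to land before the next LLM turn, i.e. it masks a race rather than fixing it. A minimal, self-contained reproduction of that race in plain asyncio (illustrative only, not Pipecat code):

```python
# Illustrative race: the LLM turn starts before the async node update
# lands, so it sees the old context. Awaiting the update (which is what
# the core fix effectively guarantees) removes the need for the sleep.
import asyncio


async def update_node(context, delay):
    await asyncio.sleep(delay)  # simulate a slow context update
    context["system"] = "ask the interview question"


async def llm_turn(context):
    return context["system"]


async def racy():
    context = {"system": "greet the user"}
    asyncio.create_task(update_node(context, 0.05))  # fire and forget
    return await llm_turn(context)  # runs before the update lands


async def fixed():
    context = {"system": "greet the user"}
    await update_node(context, 0.05)  # wait for the transition to finish
    return await llm_turn(context)


assert asyncio.run(racy()) == "greet the user"  # stale context
assert asyncio.run(fixed()) == "ask the interview question"
```

An arbitrary sleep only shifts the odds; if the update ever takes longer than the sleep, the stale-context behavior comes back, which is why awaiting the transition is the robust fix.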