0.1.x (Use tools instead of functions) #31
base: main
Conversation
jekalmin
commented
Nov 19, 2023
- Merged PR #25 (Use tool_calls instead of functions) into 0.1.x for releasing 0.1.0-beta1
- Need to merge into main branch when API is stable
Use tool_calls instead of functions
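For context, the change discussed here is essentially moving from the legacy `functions`/`function_call` request fields to the newer `tools`/`tool_choice` fields in the OpenAI chat completions API. A minimal sketch of the two payload shapes (the `execute_services` name comes from this thread; the schema itself is illustrative, not the integration's exact code):

```python
# Illustrative function schema; the real integration's schema may differ.
service_schema = {
    "name": "execute_services",
    "description": "Call a Home Assistant service",
    "parameters": {
        "type": "object",
        "properties": {
            "domain": {"type": "string"},
            "service": {"type": "string"},
        },
        "required": ["domain", "service"],
    },
}

# Legacy style: schemas go in a top-level `functions` field.
legacy_request = {
    "model": "gpt-3.5-turbo",
    "messages": [{"role": "user", "content": "Turn on the light"}],
    "functions": [service_schema],
    "function_call": "auto",
}

# New style: each schema is wrapped in a `tools` entry of type "function".
tools_request = {
    "model": "gpt-3.5-turbo",
    "messages": [{"role": "user", "content": "Turn on the light"}],
    "tools": [{"type": "function", "function": service_schema}],
    "tool_choice": "auto",
}
```

The wire-level difference is just this wrapping; the model's reply correspondingly moves from a single `function_call` field to a `tool_calls` list.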
@rkistner
Hi! If using tools instead of functions, does the prompt need to be changed, or do the functions need to be removed? I'm currently running the beta and would like to test it. Thanks!
Thanks for your interest! I released this in 1.0.2-beta1. Please try it and give feedback.
Thank you! I wanted to share that I've been experimenting with beta1 using LocalAI. With LocalAI, I have the Mixtral 8x7b (v 2.7) 6Q GGUF model set up, which is supposedly one of the best models out right now. I pointed your integration to this model, toggled the "use tools" button, and pressed submit. I was pleasantly surprised that, by doing the above, it was able to read sensor information in my Home Assistant quite well with no errors. However, when I asked it to turn on a light, it seems to have gone through the actions and acknowledged that the light was on, when in fact it didn't actually turn on. So I think using tools is moving in the right direction, but there might be some tweaks required for it to actually make service calls. Do you know if that's something that can be done in the prompt? Thank you!
As always, thanks @Anto79-ops for your cooperation.
I can check if there's a way to look at the logs of LocalAI while I send the command to turn off the light. Is that something that would be useful, or do you need the Home Assistant integration logs?
I just wanted to know if the function is called in the message history log.
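For anyone checking this in their own logs: with the tools-style API, an assistant message that triggers a call carries a `tool_calls` list instead of plain `content`. A hedged sketch of what to look for (the message dict below is a hand-written example of the documented shape, not captured output):

```python
import json

# Hand-written example of an assistant message in the chat history when the
# model decides to call a tool (shape per the OpenAI chat completions API).
assistant_message = {
    "role": "assistant",
    "content": None,
    "tool_calls": [
        {
            "id": "call_0",  # illustrative id
            "type": "function",
            "function": {
                "name": "execute_services",
                "arguments": json.dumps(
                    {"domain": "light", "service": "turn_on"}
                ),
            },
        }
    ],
}

def called_functions(message: dict) -> list[str]:
    """Return the names of any functions the assistant asked to call."""
    return [tc["function"]["name"] for tc in message.get("tool_calls", [])]

print(called_functions(assistant_message))  # ['execute_services']
```

If that list is empty for every assistant turn, the model answered in prose only and never actually requested a service call.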
Fantastic. You'll be very surprised how well it works. Here is the model I'm using, the Q6_K version: https://huggingface.co/TheBloke/dolphin-2.7-mixtral-8x7b-GGUF. It's not a small model, so it may take 20 to 40 seconds to reply if you don't have a decent CPU/GPU. Let me know how it goes!
LocalAI does not support function calling right now; you need to instruct your model to generate function calls and parse the output yourself. This integration relies on the response from the OpenAI API having the is_function_call value set, and LocalAI models are not trained to do this. I am investigating integration with https://github.com/MeetKai/functionary, which, combined with their special vLLM server, seems promising in its responses, but it's weak at general responses, so you really need multiple models. Tricky.
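Since a local model has to emit the call as plain text, one common workaround is to prompt it to answer with a JSON object and parse that out yourself. A minimal sketch of that parsing step (the output format here is an assumption baked into the prompt, not something LocalAI or the integration defines):

```python
import json
import re

def extract_function_call(raw_output: str):
    """Pull the first JSON object out of raw model text and interpret it as
    {"name": ..., "arguments": {...}}. Returns None if nothing parses."""
    match = re.search(r"\{.*\}", raw_output, re.DOTALL)
    if not match:
        return None
    try:
        call = json.loads(match.group(0))
    except json.JSONDecodeError:
        return None
    return call if "name" in call else None

# The model was instructed (via the prompt) to reply in this format.
raw = ('Sure, turning it on. {"name": "execute_services", '
       '"arguments": {"domain": "light", "service": "turn_on"}}')
print(extract_function_call(raw)["name"])  # execute_services
```

This is fragile by nature (models drift from the requested format), which is why a model fine-tuned for function calling, like functionary, is attractive despite the serving complications mentioned above.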
@ex10ded Thanks for your comments. Have you been able to get the functionary v2 GGUF model to work with LocalAI? It seems to require a special chat template if you use GGUF (not vLLM):
No, I hit the same issue as you: the template does not seem to work when using anything other than their special vLLM server (not even standard vLLM). They seem to do a lot of pre-processing of the
I'm on 1.0.3, and for some reason it works when "use tools" is off, but when I turn it on it seems to REALLY want to use execute_services for getting states. This is when using the default model and prompt.
I got LocalAI working with this addon, but the LLM seems to start looping on function calls. Has anyone encountered this?