Open WebUI: Unexpected error during intent recognition #273
Comments
I'm following this issue because I receive the same error when I try to use the n8n backend.
@guilfoos I have this extension working with my local Open WebUI/Ollama setup, so I thought I would share. Please use the API base as listed in the following documentation (it's about something unrelated, but it documents the base URL that can be used for OpenAI-like API endpoints). Hope it works for you, good luck.
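For anyone trying to rule out the integration itself, a minimal sketch of a request against an OpenAI-compatible chat endpoint. The host, port, and key below are placeholders, not values from this thread; Open WebUI's OpenAI-compatible routes live under `/api` on its web port.

```python
import json
import urllib.request

# Placeholder values -- substitute your own host, port, and Open WebUI API key.
BASE_URL = "http://docker-host:3000/api"
API_KEY = "sk-..."

def build_chat_request(base_url: str, api_key: str, model: str, prompt: str):
    """Construct an OpenAI-style chat completion request for an
    OpenAI-compatible backend such as Open WebUI."""
    url = f"{base_url}/chat/completions"
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    body = {"model": model, "messages": [{"role": "user", "content": prompt}]}
    return url, headers, json.dumps(body).encode()

if __name__ == "__main__":
    url, headers, data = build_chat_request(BASE_URL, API_KEY, "llama3", "Hello")
    req = urllib.request.Request(url, data=data, headers=headers)
    # This will raise if the base URL, port, or key is wrong -- which is
    # exactly the connectivity question being debugged in this thread.
    with urllib.request.urlopen(req) as resp:
        print(json.load(resp))
```

If this returns a completion but the Assist pipeline still fails, the problem is in the integration's settings rather than in connectivity.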
I think I tried every single possibility for the correct API domain for my Open WebUI instance and can't get it to work :( I'd really like to be able to talk to my own Open WebUI models in Home Assistant. Visiting my instance's /docs works perfectly, and I can make all the API calls in that UI, but I just can't get any URL to work with extended_openai_conversation. Any other ideas for how to correctly fill in the extended_openai_conversation fields in Home Assistant?
In my case it works randomly, only sometimes.
After setting this aside for a bit, I got my Ollama installation moved to Kubernetes, but once again I get to this exact point. I've switched out the JWT token for an API key, but when connecting to https://hostname/api as the endpoint, I consistently get "AttributeError: 'NoneType' object has no attribute 'total_tokens'" from the Assist pipeline. The only change I made from the default settings in extended_openai_conversation is the model, which I set to llama3 since that's what I have installed and am testing with. This is an extremely frustrating issue. With debug logging turned on, I can see the prompt (including exposed entities) and the response back from Open WebUI, so the connectivity is good. Perhaps the default prompt is no good? Or the model? I've included the tail end of my prompt (after the exposed devices are
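A sketch of why this particular traceback appears (the class and function names here are illustrative, not from the integration's source): OpenAI-style clients model the response's optional `usage` field, and a backend that omits token accounting leaves it as `None`, so any code that reads `response.usage.total_tokens` unconditionally raises exactly this AttributeError.

```python
class FakeResponse:
    """Stand-in for an OpenAI-style response whose backend omitted 'usage'."""
    usage = None

def total_tokens_or_zero(response) -> int:
    """Defensive version of the access that fails in the traceback above.

    Reading response.usage.total_tokens directly raises
    AttributeError: 'NoneType' object has no attribute 'total_tokens'
    whenever the backend returns no usage block.
    """
    usage = getattr(response, "usage", None)
    return usage.total_tokens if usage is not None else 0

print(total_tokens_or_zero(FakeResponse()))  # prints 0 instead of raising
```

In other words, the error is a symptom of the backend's response shape (no `usage` field), not necessarily of the prompt or the model.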
@roshangautam - care to share your prompt? Model choice?
More reading: this seems to be a function-calling problem. The logs don't indicate that the function is being provided, and I've tried the "use tools" toggle to have the extension use tools, since the function spec in the API has been deprecated. No change. Not sure how to improve my logging here to dig further.
I have Open WebUI running successfully in Docker, with an Ollama engine behind it. I can log in and use the LLM as expected.
However, when I try to connect via Home Assistant, I've had no end of problems. I've been able to iteratively improve things, but I've hit a roadblock: any input in the Assist pipeline returns "Unexpected error during intent recognition".
The Open WebUI logs seem OK:
The assist pipeline logs, however:
The only change I've made to the service setting is to change the model, everything else is default. The service itself is pointing at "http://docker-host:port/api", and is using the JWT token for the admin account in Open WebUI.
I've mucked around with other settings and have been able to get "method not allowed" errors, and model errors right in the Assist output, but I'm kind of at my wit's end.