Does not work with local models #140
Comments
Same here.
Uncheck "Use Tool Calls in Content" and "Use Vision", and set Max Actions per Step to 1. That works for me with qwen2.5:7b, but the results are poor; try a bigger model.
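For reference, here is a minimal sketch of pointing a LangChain-based setup at a local Ollama model; the model name, base URL, and num_ctx value are assumptions, not settings confirmed in this thread:

```python
# Minimal sketch: connect to a local Ollama server via LangChain's
# Ollama integration. Model name and num_ctx are assumptions;
# substitute whatever `ollama list` reports on your machine.
from langchain_ollama import ChatOllama

llm = ChatOllama(
    model="qwen2.5:7b",                 # must match a model pulled with `ollama pull`
    base_url="http://localhost:11434",  # default Ollama endpoint
    num_ctx=32000,                      # a larger context helps with long page dumps
)

print(llm.invoke("Reply with the single word: ready").content)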
I'm trying to use deepseek:7b, since it seems competent enough for basic tasks, but it simply won't appear in the Ollama LLM drop-down list in the web UI. Is there any guide on setting up local LLMs with this web UI? The Ollama server is running and I can chat with the model in the terminal, but it isn't being seen by the web UI.
You need to type the model name manually into "Model Name"; the drop-down menu is also a text input.
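To confirm the Ollama server actually exposes the model (regardless of what the drop-down shows), you can query its standard tags endpoint; a small sketch:

```python
# List the models the local Ollama server reports via its /api/tags endpoint.
# If your model is missing here, the web UI cannot use it either.
import requests

resp = requests.get("http://localhost:11434/api/tags", timeout=5)
resp.raise_for_status()
for model in resp.json().get("models", []):
    print(model["name"])
```

The name printed here is exactly what should go into the "Model Name" field.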
In my case I get tons of …
@imonedesign I tried that. Has it worked for you?
Yes, but the model took forever on the first step of the prompt, so I had to switch to the OpenAI API for now.
Trying to use a local model with Ollama (hf.co/unsloth/DeepSeek-R1-Distill-Llama-70B-GGUF:Q4_K_M) and getting the following error:

File "/home/dan/web-ui/webui.py", line 352, in run_custom_agent

Any suggestions? Thanks
@dhierholzer I solved it by manually removing that if statement.
Has anyone solved this error? I'm trying to run deepseek-r1:latest (locally via Ollama) and it keeps throwing this error. I have unchecked both "Use Vision" and "Use Tool Calls in Content".
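One possible cause (an assumption, not confirmed in this thread): DeepSeek-R1 distills emit their reasoning inside `<think>...</think>` tags, which tends to break strict JSON parsing of the model's output. A minimal sketch of stripping them before parsing, with a hypothetical helper name:

```python
# Hypothetical helper: strip DeepSeek-R1 style <think>...</think> blocks
# from a raw model response before attempting to parse it as JSON.
import json
import re

def strip_think_tags(raw: str) -> str:
    """Remove <think>...</think> reasoning blocks, including newlines."""
    return re.sub(r"<think>.*?</think>", "", raw, flags=re.DOTALL).strip()

raw_output = '<think>Let me plan the click...</think>{"action": "click"}'
print(json.loads(strip_think_tags(raw_output)))  # {'action': 'click'}
```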
Please update the code.
Downloaded two models, DeepSeek and Qwen2.5. It doesn't work with either of them.