
Does not work with local models #140

Open · GeroinEXP opened this issue Jan 24, 2025 · 11 comments

@GeroinEXP

I loaded two models, DeepSeek and Qwen2.5, but the web UI does not work with either of them.

(screenshot attached)

@cyberpsilosis

same here

@warmshao
Collaborator

Uncheck "Use Tool Calls in Content" and "Use Vision", and set Max Actions per Step = 1. That works for me with qwen2.5:7b, but the results are poor; try a bigger model.
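
In case it helps, here is a minimal sanity check (a sketch, assuming the langchain ChatOllama wrapper and the default Ollama port) that the model answers at all before pointing the web UI at it:

```python
from langchain_ollama import ChatOllama

# Talk to the local Ollama server directly, bypassing the web UI,
# to confirm the model is pulled and responding.
llm = ChatOllama(model="qwen2.5:7b", base_url="http://localhost:11434")
reply = llm.invoke("Reply with the single word: ready")
print(reply.content)
```

If this prints a reply but the web UI still fails, the problem is in the UI settings rather than in the model itself.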

@cyberpsilosis

I'm trying to use deepseek:7b, as it seems competent enough for basic tasks.

It simply won't appear in the Ollama LLM drop-down list in the web UI. Is there a guide for setting up local LLMs with this web UI? The Ollama server is running and I can chat with the model in the terminal, but the web UI doesn't see it.
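
For reference, a running server doesn't guarantee the exact model tag is available; this quick check (a sketch, assuming the default Ollama port) lists what the server actually exposes:

```python
import requests

# Ollama lists locally pulled models at /api/tags; the web UI can
# only use names that appear in this list, spelled exactly.
resp = requests.get("http://localhost:11434/api/tags", timeout=5)
resp.raise_for_status()
for m in resp.json().get("models", []):
    print(m["name"])  # e.g. "deepseek-r1:7b"
```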

@imonedesign

You need to type the model name manually into "Model Name"; the drop-down menu is also a text input.

@kevinrodriguez-io

In my case I get tons of `Error: received prediction-error`, even with ngrok set up to forward my LM Studio server.
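
One way to narrow that down (a sketch, assuming LM Studio's default OpenAI-compatible server on port 1234; the model id here is hypothetical) is to query LM Studio directly first, then swap in the ngrok URL, and see which hop produces the error:

```python
from openai import OpenAI

# LM Studio exposes an OpenAI-compatible API, by default on port 1234.
# Test it locally before involving the tunnel to isolate the failure.
client = OpenAI(base_url="http://localhost:1234/v1", api_key="lm-studio")
reply = client.chat.completions.create(
    model="deepseek-r1-distill-qwen-7b",  # hypothetical; use an id LM Studio lists
    messages=[{"role": "user", "content": "Say hello."}],
)
print(reply.choices[0].message.content)
```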

@cyberpsilosis

@imonedesign I tried that. Has it worked for you?

@imonedesign

> @imonedesign I tried that. Has it worked for you?

Yes, but the model took forever on the first step of the prompt, so I had to switch to the OpenAI API for now.

@dhierholzer

Trying to use a local model with Ollama (hf.co/unsloth/DeepSeek-R1-Distill-Llama-70B-GGUF:Q4_K_M).

I get the following error:

```
Traceback (most recent call last):
  File "/home/dan/web-ui/webui.py", line 352, in run_custom_agent
    agent = CustomAgent(
            ^^^^^^^^^^^^
  File "/home/dan/web-ui/src/agent/custom_agent.py", line 92, in __init__
    if self.llm.model_name in ["deepseek-reasoner"]:
       ^^^^^^^^^^^^^^^^^^^
  File "/home/dan/miniconda3/envs/browseruse/lib/python3.12/site-packages/pydantic/main.py", line 891, in __getattr__
    raise AttributeError(f'{type(self).__name__!r} object has no attribute {item!r}')
AttributeError: 'ChatOllama' object has no attribute 'model_name'. Did you mean: 'model_dump'?
```

Any suggestions?

Thanks

@giokevin

@dhierholzer I solved it by manually removing that if statement.
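
For anyone who would rather keep the check than delete it, a more defensive version (a sketch; `model` is the field the Ollama wrapper exposes instead of `model_name`) avoids the AttributeError:

```python
# In src/agent/custom_agent.py: ChatOllama is a pydantic model without
# a `model_name` field, so probe for it instead of assuming it exists.
llm_name = getattr(self.llm, "model_name", None) or getattr(self.llm, "model", "")
if llm_name in ["deepseek-reasoner"]:
    ...  # deepseek-reasoner-specific handling, unchanged
```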

@pathanjalisrinivasan

> Trying to use a local model with Ollama (hf.co/unsloth/DeepSeek-R1-Distill-Llama-70B-GGUF:Q4_K_M). I get the following error:
>
> `AttributeError: 'ChatOllama' object has no attribute 'model_name'. Did you mean: 'model_dump'?`
>
> Any suggestions?

Has anyone solved this error? I'm trying to run deepseek-r1:latest (locally, using Ollama) and it keeps throwing this error. I have unchecked both Use Vision and Use Tool Calls in Content.

@vvincent1234
Contributor

> Has anyone solved this error? I'm trying to run deepseek-r1:latest (locally, using Ollama) and it keeps throwing this error. I have unchecked both Use Vision and Use Tool Calls in Content.

Please update to the latest code.
