
[BUG]: Ollama URL on the local LAN is not recognized #800

Open
wwjCMP opened this issue May 4, 2024 · 6 comments

Comments


wwjCMP commented May 4, 2024

Description

http://192.168.101.19:11434
gemma:7b-instruct-fp16

Cannot connect to Ollama.

Reproduction

Ollama is deployed on another Linux machine.

Platform

Windows

System Version

Windows 11

Window System (Linux Only)

None

Software Version

2.7

Log File

No response

Additional Information

No response
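For the setup above, a quick reachability check is to query the remote instance's /api/tags endpoint (Ollama's standard model-listing route); the address below is the one from the report, and the script is only a diagnostic sketch. Note that Ollama binds to 127.0.0.1 by default, so a LAN deployment usually needs OLLAMA_HOST=0.0.0.0 (or an equivalent bind address) set on the Linux host.

```python
import json
import urllib.request

# Address of the remote Ollama instance from the report above.
OLLAMA_URL = "http://192.168.101.19:11434"

try:
    # /api/tags lists the models available on the server; a 200 response
    # confirms the Ollama HTTP API is reachable from this machine.
    with urllib.request.urlopen(f"{OLLAMA_URL}/api/tags", timeout=5) as resp:
        tags = json.load(resp)
    print("Reachable; models:", [m["name"] for m in tags.get("models", [])])
except OSError as exc:
    print("Cannot reach Ollama:", exc)
```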

@xiao-zy19

I ran into the same problem. With tailscale, the ollama plugin fails to recognize the service (toggling streaming output makes no difference), but switching to the openai plugin works, as long as streaming output stays off; enabling it errors out as well.

@futurewin

On 2.7.10, none of the four Ollama buttons can connect to the local Ollama instance.

xtyuns (Member) commented Jun 1, 2024

> Cannot connect to Ollama.

That description is too vague. Please provide a more detailed problem description or the relevant log contents.


or-less commented Jun 11, 2024

From other issues I learned that OpenAI streaming output runs into cross-origin (CORS) request problems. Forward a local port to the server's port, then set the OpenAI service address in pot to 127.0.0.1 plus that local port, and it works normally.
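A sketch of that workaround (addresses are illustrative, taken from the report above; ssh -L or any TCP proxy would do the same job) is a minimal local port forwarder:

```python
import socket
import threading

# Illustrative addresses: listen locally and forward to the remote
# Ollama/OpenAI-compatible server on the LAN (see the report above).
LOCAL = ("127.0.0.1", 11434)
REMOTE = ("192.168.101.19", 11434)

def pipe(src: socket.socket, dst: socket.socket) -> None:
    # Copy bytes one way until either side closes the connection.
    try:
        while data := src.recv(65536):
            dst.sendall(data)
    except OSError:
        pass
    finally:
        dst.close()

def main() -> None:
    server = socket.create_server(LOCAL)
    print(f"Forwarding {LOCAL} -> {REMOTE}")
    while True:
        client, _ = server.accept()
        upstream = socket.create_connection(REMOTE)
        # One thread per direction so streamed responses pass straight through.
        threading.Thread(target=pipe, args=(client, upstream), daemon=True).start()
        threading.Thread(target=pipe, args=(upstream, client), daemon=True).start()

if __name__ == "__main__":
    main()
```

With this running, pointing pot's OpenAI service address at http://127.0.0.1:11434 makes every request target a local address, which is what the workaround above relies on.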


ldwnt commented Aug 4, 2024

> Cannot connect to Ollama.
>
> That description is too vague. Please provide a more detailed problem description or the relevant log contents.

As shown in the screenshot, the Ollama API can be called successfully, but adding the service in pot fails.

[screenshot]


ldwnt commented Aug 4, 2024

> From other issues I learned that OpenAI streaming output runs into cross-origin (CORS) request problems. Forward a local port to the server's port, then set the OpenAI service address in pot to 127.0.0.1 plus that local port, and it works normally.

Adding an openai service that points at a non-local Ollama API also works, but adding an ollama service fails.
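This is consistent with the two plugins hitting different routes: besides its native /api endpoints, Ollama also exposes an OpenAI-compatible API under /v1, which is what an openai-type service would call. A minimal sketch of such a request (model and address taken from the original report, non-streaming since streaming is what triggered the errors above):

```python
import json
import urllib.request

# Ollama's OpenAI-compatible endpoint; the native API lives under /api.
URL = "http://192.168.101.19:11434/v1/chat/completions"

payload = {
    "model": "gemma:7b-instruct-fp16",  # model from the original report
    "messages": [{"role": "user", "content": "Say hello."}],
    "stream": False,  # streaming is what triggered the CORS errors above
}

req = urllib.request.Request(
    URL,
    data=json.dumps(payload).encode(),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req, timeout=60) as resp:
    body = json.load(resp)
print(body["choices"][0]["message"]["content"])
```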
