[BUG]: Cannot recognize an ollama endpoint on the local LAN #800
Comments
I ran into the same problem. Using tailscale with the ollama plugin fails (toggling streaming output makes no difference), but switching from the ollama plugin to the openai plugin works — although the openai plugin cannot have streaming output enabled, or it errors as well.
On 2.7.10, none of the 4 ollama buttons can connect to the local ollama.
This description is too vague; please provide a more detailed problem description or log output.
From another issue I learned that OpenAI streaming output runs into cross-origin request restrictions. By forwarding a local port to the server's port, and then setting the OpenAI service address in pot to 127.0.0.1 plus that port, access works normally.
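A minimal sketch of the port-forwarding workaround described above, assuming the Ollama host is the 192.168.101.19 machine from the description and that an SSH login is available (the username `user` is hypothetical):

```shell
# Forward local port 11434 to the ollama port on the LAN machine.
# -N: no remote command, just forwarding. "user" is a hypothetical login.
ssh -N -L 11434:localhost:11434 user@192.168.101.19

# Then set the OpenAI-compatible service address in pot to
#   http://127.0.0.1:11434
# so requests appear local and avoid the cross-origin restriction.
```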
Adding an openai service pointing at the non-local ollama API also works, but adding an ollama service fails.
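If the failure is a bind-address or cross-origin issue on the server side, Ollama itself can be configured to accept LAN and cross-origin requests via its documented `OLLAMA_HOST` and `OLLAMA_ORIGINS` environment variables. A sketch for the Linux machine running ollama:

```shell
# Bind ollama to all interfaces instead of only 127.0.0.1,
# and allow cross-origin requests from any origin.
export OLLAMA_HOST=0.0.0.0:11434
export OLLAMA_ORIGINS='*'
ollama serve
```

Allowing all origins is for diagnosis only; in practice the origins list should be narrowed to the client actually making the requests.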
Description
http://192.168.101.19:11434
gemma:7b-instruct-fp16
Cannot connect to ollama.
Reproduction
ollama is deployed on another Linux machine.
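To confirm the server is reachable from the Windows machine at all, Ollama's HTTP API can be probed directly (address taken from the description above):

```shell
# Should return a JSON list of installed models if the server is reachable.
curl http://192.168.101.19:11434/api/tags
```

If this fails too, the problem is network- or firewall-level rather than in pot itself.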
Platform
Windows
System Version
Windows 11
Window System (Linux Only)
None
Software Version
2.7
Log File
No response
Additional Information
No response