
Use another LLM provider but still access ollama #77

Open
rickywu opened this issue Jul 30, 2024 · 1 comment

Comments

rickywu commented Jul 30, 2024

I'm using xinference and changed .env and settings.yaml accordingly,

but starting app.py still fails with this error:

Exception while fetching openai_chat models: HTTPConnectionPool(host='localhost', port=11434): Max retries exceeded with url: /v1/models (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7f153cccf9b0>: Failed to establish a new connection: [Errno 111] Connection refused'))

.env

LLM_PROVIDER=openai
LLM_API_BASE=http://172.17.22.174:9997/v1
LLM_MODEL='Qwen1.5-14B-Chat-GPTQ-Int4'
LLM_API_KEY=''

EMBEDDINGS_PROVIDER=openai
EMBEDDINGS_API_BASE=http://172.17.22.174:9997/v1
EMBEDDINGS_MODEL='m3e-base'
EMBEDDINGS_API_KEY=''

settings.yaml:

llm:
  api_key: ${GRAPHRAG_API_KEY}
  type: openai_chat # or azure_openai_chat
  model: Qwen1.5-14B-Chat-GPTQ-Int4
  model_supports_json: true # recommended if this is available for your model.
  # max_tokens: 4000
  # request_timeout: 180.0
  api_base: http://172.17.22.174:9997/v1
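Before digging into the app's config, it can help to confirm the xinference endpoint itself is reachable and OpenAI-compatible. A minimal sketch (the `list_models` helper and the address are taken from this issue; adjust to your setup):

```python
import json
import urllib.request


def models_url(api_base: str) -> str:
    # Normalize the base so ".../v1" and ".../v1/" both resolve to /v1/models.
    return api_base.rstrip("/") + "/models"


def list_models(api_base: str) -> list:
    """Query an OpenAI-compatible endpoint for its model list."""
    with urllib.request.urlopen(models_url(api_base), timeout=10) as resp:
        return json.load(resp)["data"]


# Uncomment to probe the xinference server from the issue:
# print(list_models("http://172.17.22.174:9997/v1"))
```

If this call succeeds but the app still reports `localhost:11434`, the app is not reading `api_base` from your configuration.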

Tovi163 commented Aug 11, 2024

Exception while fetching openai_chat models: HTTPConnectionPool(host='localhost', port=11434)

By default the request goes to 127.0.0.1:11434 (Ollama's default port), but your api_base points to http://172.17.22.174:9997. Either:

  1. make xinference listen on 127.0.0.1:11434, or
  2. change the GraphRAG-Local-UI source code to point to http://172.17.22.174:9997
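For option 2, a hedged sketch of what such a source change could look like (the function name `resolve_api_base` is hypothetical; the real GraphRAG-Local-UI code may structure this differently): instead of hardcoding the Ollama default, prefer the `LLM_API_BASE` variable the user already sets in .env.

```python
import os

# Hypothetical patch sketch: fall back to Ollama's default only when
# LLM_API_BASE (as set in .env) is absent from the environment.
def resolve_api_base(default: str = "http://localhost:11434/v1") -> str:
    return os.environ.get("LLM_API_BASE") or default
```

With `LLM_API_BASE=http://172.17.22.174:9997/v1` exported, the model-listing request would then target the xinference server rather than localhost.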
