Integration with Ollama? #221

Answered by brotherkaif
brotherkaif asked this question in Q&A

Ignore me, I've answered my own question. It seems that you only need to add an entry for the correct model name (in this case, llama2).

The following is a workable config for Ollama:

```yaml
apis:
  openai:
    # base-url: https://api.openai.com/v1
    base-url: http://localhost:11434/v1
    api-key: "ignored"
    api-key-env: OPENAI_API_KEY
    models:
      llama2:
        aliases: ["4"]
        max-input-chars: 24500
        fallback:
```
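
For anyone trying the same thing, here's a minimal sketch of how to verify the setup. It assumes Ollama is serving its OpenAI-compatible API on the default port (11434), that the llama2 model has been pulled, and that mods' `-m`/`--model` flag is used to pick the model name or alias from the config above:

```sh
# Fetch the model if it isn't already present (assumes a local Ollama install):
ollama pull llama2

# Sanity-check Ollama's OpenAI-compatible endpoint directly; it accepts
# standard chat completions requests:
curl http://localhost:11434/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "llama2", "messages": [{"role": "user", "content": "Hello!"}]}'

# With the config above, mods should resolve the model by name or by its alias:
mods --model llama2 "Say hello"
mods --model 4 "Say hello"
```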

I'm going to have a bit more of a play around with this tonight.
