
Ollama support #207

Open
DaveChini opened this issue Apr 26, 2024 · 3 comments

Comments

@DaveChini

The Ollama API should be callable using the OpenAIApi client by just changing the API host to point to the Ollama server location, but Ollama streams by default, and I think this is what causes it to fail with Glarity. As far as I know there is no way to pass custom parameters along with the API call.
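
For reference, a minimal sketch of what a non-streaming request against a local Ollama server might look like through its OpenAI-compatible endpoint. This assumes a default local install on localhost:11434 and uses a placeholder model name; it is not Glarity's code, just an illustration of disabling streaming explicitly via `stream: false` in the request body:

```ts
// Minimal sketch: call a local Ollama server through its OpenAI-compatible
// endpoint with streaming explicitly disabled. The base URL, port, and model
// name are assumptions about a default local install, not Glarity internals.
async function askOllama(prompt: string): Promise<string> {
  const response = await fetch("http://localhost:11434/v1/chat/completions", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "llama3", // placeholder model name
      messages: [{ role: "user", content: prompt }],
      stream: false,   // request a single JSON response instead of a stream
    }),
  });
  if (!response.ok) {
    throw new Error(`Ollama request failed: ${response.status}`);
  }
  const data = await response.json();
  // OpenAI-style response shape: choices[0].message.content
  return data.choices[0].message.content;
}
```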

@razvanab

Please add Ollama.

@uniquMonte

Please add Ollama too, thanks.

@realgooseman

Hi @sparticleinc, any updates on this? Is a fully local and private experience with Ollama even being considered?
