Does it support Ollama? #15

Comments
I was wondering this also and had a look; the code has quite a few places with hardcoded API base URLs and the like. I had a little hack and got it working with local LLMs just as an experiment: https://github.com/sammcj/MoA
It does -- just change the API endpoint in utils.py and the model names in bot.py.
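To illustrate the change described above, here is a minimal hedged sketch. The helper function and its name are hypothetical (not taken from the repo); it just builds an OpenAI-style chat-completions request, so swapping Together's API for a local Ollama server reduces to changing the base URL and model name. Ollama exposes an OpenAI-compatible endpoint on port 11434 by default.

```python
import json

def build_chat_request(base_url, model, messages):
    """Return (url, payload) for an OpenAI-compatible /chat/completions call.

    Hypothetical helper: the fork discussion suggests only the base URL
    and model name need to change to target a local server.
    """
    url = base_url.rstrip("/") + "/chat/completions"
    payload = json.dumps({"model": model, "messages": messages})
    return url, payload

# Targeting a local Ollama server instead of Together's API
# ("llama3" is an example model name; use whatever you have pulled):
url, payload = build_chat_request(
    "http://localhost:11434/v1",
    "llama3",
    [{"role": "user", "content": "Hello"}],
)
```

The same helper works unchanged against any other OpenAI-formatted endpoint (LM Studio, Groq, Together) by substituting the base URL.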
I hope this can be made configurable to use a custom API endpoint rather than Together's API endpoint.
I iterated on sammcj's fork. It should work with any OpenAI-formatted endpoint, including local models; I have tested LM Studio and Groq. Just create your own .env file based on the .env.template file, and it should work with Ollama: https://github.com/erik-sv/MoA
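The .env-based approach described above can be sketched as follows. The variable names here are assumptions for illustration; check the fork's actual .env.template for the real keys. The idea is that the endpoint and model come from the environment, with a local Ollama server as the fallback.

```python
import os

# Assumed variable names -- consult the fork's .env.template for the
# actual keys. A .env loader (e.g. python-dotenv) would populate these
# before this code runs; plain os.environ lookups shown for simplicity.
API_BASE = os.environ.get("OPENAI_API_BASE", "http://localhost:11434/v1")
MODEL = os.environ.get("MODEL_NAME", "llama3")
```

With no environment overrides, this falls back to Ollama's default local endpoint; setting the variables redirects the same code at LM Studio, Groq, or any other OpenAI-compatible server.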
No description provided.