
Document how to configure the project to use a self-hosted Ollama LLM #2

Open
GregHilston opened this issue Jul 21, 2024 · 0 comments

Comments

@GregHilston

For example, say I'm running Ollama on the same machine as this container, or even on another machine. How can I configure this project to use that LLM, and skip using Claude and ChatGPT entirely?
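A possible direction, assuming this project can read standard OpenAI-style settings (this is not confirmed by the repo; the variable names below are illustrative): Ollama serves an OpenAI-compatible API at `/v1` on its default port 11434, so redirecting the base URL there would let an OpenAI client talk to a local model instead of a hosted one. A minimal sketch:

```python
# Sketch: redirecting an OpenAI-compatible client to a self-hosted Ollama
# server. Assumptions: the app honors OPENAI_BASE_URL / OPENAI_API_KEY,
# and Ollama is running with its default port (11434). The "MODEL" name
# is hypothetical and depends on what the project expects.
import os

OLLAMA_BASE_URL = "http://localhost:11434/v1"  # or another machine's host/IP
OLLAMA_MODEL = "llama3"                        # any model pulled via `ollama pull`


def ollama_env(base_url: str = OLLAMA_BASE_URL, model: str = OLLAMA_MODEL) -> dict:
    """Return environment overrides that point OpenAI-style clients at Ollama."""
    return {
        "OPENAI_BASE_URL": base_url,
        "OPENAI_API_KEY": "ollama",  # Ollama ignores the key, but most clients require one
        "MODEL": model,              # hypothetical setting name for the model choice
    }


if __name__ == "__main__":
    # Apply the overrides before the application initializes its LLM client.
    for key, value in ollama_env().items():
        os.environ[key] = value
```

If Ollama runs on a different machine, replace `localhost` with that machine's address and make sure Ollama is listening on all interfaces there (e.g. `OLLAMA_HOST=0.0.0.0`).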
