Is it possible to access the Ollama API #334
Replies: 2 comments 6 replies
-
Hi, yes, I just checked and it works! What you need to do is go to …, then on the Continue menu go to …. If you have any other questions, feel free to ask!
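The links in the reply above didn't survive, but for anyone landing here, a sketch of what the Ollama entry in Continue's config file might look like. The exact schema depends on your Continue version, and the model name and title are placeholders:

```json
{
  "models": [
    {
      "title": "Ollama (local)",
      "provider": "ollama",
      "model": "llama2",
      "apiBase": "http://localhost:11434"
    }
  ]
}
```

`apiBase` points at the default local Ollama port; change it if your server listens elsewhere.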
-
Exciting to see that this works. Has anyone tried it with the Emacs chatbots, like Ellama or gptel? I have just started tinkering and haven't gotten either working, but I haven't tried all the configuration options yet.
-
There are some VSCode plugins that can access local LLMs. I'm using Continue.
Is it possible to access the local Ollama / Alpaca API so that extensions can use it?
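Yes, Ollama exposes an HTTP API on localhost once `ollama serve` is running, which is what extensions like Continue talk to. A minimal sketch of calling it directly from Python, assuming the default endpoint `http://localhost:11434` and its `/api/generate` route; the model name here is a placeholder for whatever you have pulled:

```python
import json
import urllib.request

# Default local Ollama endpoint; adjust host/port if yours differs.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_payload(prompt: str, model: str = "llama2") -> bytes:
    """Build a non-streaming /api/generate request body."""
    return json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()

def generate(prompt: str, model: str = "llama2") -> str:
    """POST the prompt to the local Ollama server and return its response text."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=build_payload(prompt, model),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

Calling `generate("Why is the sky blue?")` requires a running Ollama server with the model already pulled; any HTTP client (curl, a VSCode extension, an Emacs package) can hit the same endpoint.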