Using Ollama

Simple Setup

The easiest way to use Ollama with coco out of the box is to add or update the service option in your coco config, setting it to ollama.
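
At its simplest, that is a config containing little more than the service key; the example in the next section layers a few optional settings on top of this.

{
  "$schema": "https://git-co.co/schema.json",
  "service": "ollama"
}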

Alias Definitions

The default service configuration objects connected to the shorthand aliases of openai and ollama can be found here.

Example Project Config using Service Alias

{
  "$schema": "https://git-co.co/schema.json",
  "service": "ollama",
  "verbose": true,
  "ignoredFiles": [
    "package-lock.json"
  ], 
  "ignoredExtensions": [
    "map",
    "lock"
  ]
}

Full Customization

If you want to go deeper than the shorthand aliases of openai and ollama, you can tinker with the underlying model, temperature, endpoint, and other settings available in the OpenAILLMService or OllamaLLMService interfaces by replacing the alias with a full service object in your config.

Example of Service Object
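
The exact shape of the service object is defined by those interfaces in the coco source; the sketch below is only illustrative, assuming a provider-style key to select Ollama and using placeholder model and temperature values (http://localhost:11434 is Ollama's default local address).

{
  "$schema": "https://git-co.co/schema.json",
  "service": {
    "provider": "ollama",
    "model": "mistral",
    "endpoint": "http://localhost:11434",
    "temperature": 0.4
  },
  "verbose": true
}

Check the OllamaLLMService interface for the authoritative field names and defaults before copying this into a project config.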