feat: support running ollama from the local binary
mdelapenya committed Dec 3, 2024
1 parent 4d0f7b7 commit 20c58a2
Showing 10 changed files with 948 additions and 2 deletions.
3 changes: 3 additions & 0 deletions .github/scripts/modules/ollama/install-dependencies.sh
@@ -0,0 +1,3 @@
#!/usr/bin/env bash

curl -fsSL https://ollama.com/install.sh | sh
11 changes: 11 additions & 0 deletions .github/workflows/ci-test-go.yml
@@ -109,6 +109,17 @@ jobs:
working-directory: ./${{ inputs.project-directory }}
run: go build

- name: Install dependencies
working-directory: ./${{ inputs.project-directory }}
shell: bash
run: |
SCRIPT_PATH="./.github/scripts/${{ inputs.project-directory }}/install-dependencies.sh"
if [ -f "$SCRIPT_PATH" ]; then
bash "$SCRIPT_PATH"
else
echo "No dependencies script found at $SCRIPT_PATH - skipping installation"
fi
- name: go test
# only run tests on linux, there are a number of things that won't allow the tests to run on anything else
# many (maybe, all?) images used can only be build on Linux, they don't have Windows in their manifest, and
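The `Install dependencies` step above only runs the script when it exists, so modules without extra dependencies are unaffected. A minimal local sketch of that guard (using a hypothetical temp layout, not the CI checkout) shows both branches:

```shell
#!/usr/bin/env bash
# Sketch of the CI step's guard: run the script only if it exists.
set -euo pipefail

workdir="$(mktemp -d)"
trap 'rm -rf "$workdir"' EXIT

run_deps() {
  SCRIPT_PATH="$1"
  if [ -f "$SCRIPT_PATH" ]; then
    bash "$SCRIPT_PATH"
  else
    echo "No dependencies script found at $SCRIPT_PATH - skipping installation"
  fi
}

# Branch 1: no script for this module, so installation is skipped.
run_deps "$workdir/missing.sh"

# Branch 2: a script is present and gets executed.
printf 'echo "installing deps"\n' > "$workdir/install.sh"
run_deps "$workdir/install.sh"
```

Because the step never fails when the script is absent, the same workflow can be reused across every module directory.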
38 changes: 38 additions & 0 deletions docs/modules/ollama.md
@@ -16,10 +16,15 @@ go get github.com/testcontainers/testcontainers-go/modules/ollama

## Usage example

The module allows you to run the Ollama container or the local Ollama binary.

<!--codeinclude-->
[Creating an Ollama container](../../modules/ollama/examples_test.go) inside_block:runOllamaContainer
[Running the local Ollama binary](../../modules/ollama/examples_test.go) inside_block:localOllama
<!--/codeinclude-->

If the local Ollama binary fails to execute, the module falls back to the container version of Ollama.

## Module Reference

### Run function
@@ -48,6 +53,39 @@ When starting the Ollama container, you can pass options in a variadic way to co
If you need to set a different Ollama Docker image, you can set a valid Docker image as the second argument in the `Run` function.
E.g. `Run(context.Background(), "ollama/ollama:0.1.25")`.

#### Use Local

- Not available until the next release of testcontainers-go <a href="https://github.com/testcontainers/testcontainers-go"><span class="tc-version">:material-tag: main</span></a>

If you need to run the local Ollama binary, you can set the `UseLocal` option in the `Run` function.
This option accepts a list of environment variables, as `KEY=value` strings, that are applied to the Ollama binary when executing commands.

E.g. `Run(context.Background(), "ollama/ollama:0.1.25", WithUseLocal("OLLAMA_DEBUG=true"))`.
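The environment variables are plain `KEY=value` strings. As an illustrative sketch (the `splitEnv` helper is hypothetical, not part of the testcontainers-go API), they can be split the same way process environments are:

```go
package main

import (
	"fmt"
	"strings"
)

// splitEnv splits a KEY=value pair at the first '='.
// Illustrative helper only; not part of the module's API.
func splitEnv(kv string) (key, value string, ok bool) {
	return strings.Cut(kv, "=")
}

func main() {
	key, value, ok := splitEnv("OLLAMA_DEBUG=true")
	fmt.Println(key, value, ok) // → OLLAMA_DEBUG true true
}
```

Values containing `=` (for example a connection string) are handled naturally, since only the first `=` separates key from value.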

All the container methods are available when using the local Ollama binary, but will be executed locally instead of inside the container.
Please consider the following differences when using the local Ollama binary:

- The local Ollama binary will create a log file in the current working directory, identified by the session ID. E.g. `local-ollama-<session-id>.log`.
- `ConnectionString` returns the connection string to connect to the local Ollama binary instead of the container, which maps to `127.0.0.1:11434`.
- `ContainerIP` returns `127.0.0.1`.
- `ContainerIPs` returns `["127.0.0.1"]`.
- `CopyToContainer`, `CopyDirToContainer`, `CopyFileToContainer` and `CopyFileFromContainer` don't perform any action.
- `GetLogProductionErrorChannel` returns a nil channel.
- `Endpoint` returns the endpoint to connect to the local Ollama binary instead of the container, which maps to `127.0.0.1:11434`.
- `Exec` passes the command to the local Ollama binary instead of the container. The first argument is the command to execute, and the second is the list of arguments.
- `GetContainerID` returns a synthetic ID for the local process, `local-ollama-<session-id>`, instead of a real container ID.
- `Host` returns `127.0.0.1`.
- `Inspect` returns a `ContainerJSON` describing the state of the local Ollama binary.
- `IsRunning` returns `true` if the local Ollama binary process is running.
- `Logs` returns the logs from the local Ollama binary instead of the container.
- `MappedPort` returns the port mapping for the local Ollama binary instead of the container.
- `Start` starts the local Ollama binary process.
- `State` returns the current state of the local Ollama binary process, `stopped` or `running`.
- `Stop` stops the local Ollama binary process.
- `Terminate` calls the `Stop` method and then removes the log file.

The local Ollama binary writes its log file to the current working directory, and its contents are exposed through the `Logs` method.
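The local-mode defaults described above (the `local-ollama-<session-id>.log` file name and the `127.0.0.1:11434` endpoint) can be sketched as follows; `sessionID` is a hypothetical stand-in for the value testcontainers-go generates, and these helpers are illustrative, not the module's code:

```go
package main

import (
	"fmt"
	"net"
)

// localLogFile builds the log file name the docs describe:
// local-ollama-<session-id>.log in the current working directory.
func localLogFile(sessionID string) string {
	return fmt.Sprintf("local-ollama-%s.log", sessionID)
}

// localEndpoint is the host:port the local binary listens on by default.
func localEndpoint() string {
	return net.JoinHostPort("127.0.0.1", "11434")
}

func main() {
	fmt.Println(localLogFile("abc123")) // → local-ollama-abc123.log
	fmt.Println(localEndpoint())        // → 127.0.0.1:11434
}
```

`ConnectionString`, `Endpoint`, `Host`, `ContainerIP`, and `MappedPort` all derive from that single local address, which is why they return loopback-based values in local mode.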

{% include "../features/common_functional_options.md" %}

### Container Methods
70 changes: 70 additions & 0 deletions modules/ollama/examples_test.go
@@ -173,3 +173,73 @@ func ExampleRun_withModel_llama2_langchain() {

// Intentionally not asserting the output, as we don't want to run this example in the tests.
}

func ExampleRun_withLocal() {
ctx := context.Background()

// localOllama {
ollamaContainer, err := tcollama.Run(ctx, "ollama/ollama:0.3.13", tcollama.WithUseLocal("OLLAMA_DEBUG=true"))
defer func() {
if err := testcontainers.TerminateContainer(ollamaContainer); err != nil {
log.Printf("failed to terminate container: %s", err)
}
}()
if err != nil {
log.Printf("failed to start container: %s", err)
return
}
// }

model := "llama3.2:1b"

_, _, err = ollamaContainer.Exec(ctx, []string{"ollama", "pull", model})
if err != nil {
log.Printf("failed to pull model %s: %s", model, err)
return
}

_, _, err = ollamaContainer.Exec(ctx, []string{"ollama", "run", model})
if err != nil {
log.Printf("failed to run model %s: %s", model, err)
return
}

connectionStr, err := ollamaContainer.ConnectionString(ctx)
if err != nil {
log.Printf("failed to get connection string: %s", err)
return
}

var llm *langchainollama.LLM
if llm, err = langchainollama.New(
langchainollama.WithModel(model),
langchainollama.WithServerURL(connectionStr),
); err != nil {
log.Printf("failed to create langchain ollama: %s", err)
return
}

completion, err := llm.Call(
context.Background(),
"how can Testcontainers help with testing?",
llms.WithSeed(42), // a fixed seed makes the completion reproducible
llms.WithTemperature(0.0), // a temperature of 0 minimizes randomness in the completion
)
if err != nil {
log.Printf("failed to generate completion: %s", err)
return
}

words := []string{
"easy", "isolation", "consistency",
}
lwCompletion := strings.ToLower(completion)

for _, word := range words {
if strings.Contains(lwCompletion, word) {
fmt.Println(true)
}
}

// Intentionally not asserting the output, as we don't want to run this example in the tests.
}
2 changes: 1 addition & 1 deletion modules/ollama/go.mod
@@ -4,6 +4,7 @@ go 1.22

require (
github.com/docker/docker v27.1.1+incompatible
github.com/docker/go-connections v0.5.0
github.com/google/uuid v1.6.0
github.com/stretchr/testify v1.9.0
github.com/testcontainers/testcontainers-go v0.34.0
@@ -22,7 +23,6 @@ require (
github.com/davecgh/go-spew v1.1.1 // indirect
github.com/distribution/reference v0.6.0 // indirect
github.com/dlclark/regexp2 v1.8.1 // indirect
github.com/docker/go-connections v0.5.0 // indirect
github.com/docker/go-units v0.5.0 // indirect
github.com/felixge/httpsnoop v1.0.4 // indirect
github.com/go-logr/logr v1.4.1 // indirect
