feat(ollama): support for running the local Ollama binary #2908

Open · wants to merge 36 commits into base: main

Commits (36)
20c58a2
feat: support running ollama from the local binary
mdelapenya Dec 3, 2024
9aa0f34
fix: wrong working dir at CI
mdelapenya Dec 3, 2024
d6d5bfb
chore: extract wait to a function
mdelapenya Dec 3, 2024
7b3be54
chore: print local binary logs on error
mdelapenya Dec 3, 2024
4761fa1
chore: remove debug logs
mdelapenya Dec 3, 2024
15d829b
fix(ci): kill ollama before the tests
mdelapenya Dec 3, 2024
15784af
chore: stop ollama using systemctl
mdelapenya Dec 3, 2024
df12237
chore: support setting log file from the env
mdelapenya Dec 4, 2024
8228583
chore: support running ollama commands, only
mdelapenya Dec 4, 2024
0bebaea
Merge branch 'main' into ollama-local
mdelapenya Dec 13, 2024
00936c3
fix: release lock on error
mdelapenya Dec 13, 2024
ee30a02
chore: add more test coverage for the option
mdelapenya Dec 13, 2024
5b0e8c2
chore: simplify useLocal checks
mdelapenya Dec 13, 2024
e16fc00
chore: simpolify
mdelapenya Dec 13, 2024
0fc2a21
chore: pass context to runLocal
mdelapenya Dec 13, 2024
58a46b4
chore: move ctx to the right scope
mdelapenya Dec 13, 2024
01c560d
chore: remove not needed
mdelapenya Dec 13, 2024
7621298
chore: use a container function
mdelapenya Dec 13, 2024
98ecae9
chore: support reading OLLAMA_HOST
mdelapenya Dec 13, 2024
644278f
chore: return error with copy APIs
mdelapenya Dec 13, 2024
25f7c56
chore: simply execute the script
mdelapenya Dec 13, 2024
0e6c3d0
chore: simplify var initialisation
mdelapenya Dec 13, 2024
cca2761
chore: return nil
mdelapenya Dec 13, 2024
5c5058e
fix: return errors on terminate
mdelapenya Dec 13, 2024
857a378
chore: remove options type
mdelapenya Dec 13, 2024
ccd1974
chore: use a map
mdelapenya Dec 13, 2024
eab5fb2
chor: simplify error on wait
mdelapenya Dec 13, 2024
ddc96b4
chore: wrap start logic around the localContext
mdelapenya Dec 16, 2024
299e514
chor: fold
mdelapenya Dec 16, 2024
6ab96ae
chore: merge wait into start
mdelapenya Dec 16, 2024
5cdeb2d
fix: use proper ContainersState
mdelapenya Dec 16, 2024
a8824c0
fix: remove extra conversion
mdelapenya Dec 16, 2024
953518e
chore: handle remove log file errors properly
mdelapenya Dec 16, 2024
1a2ec6b
chore: go back to string in env vars
mdelapenya Dec 16, 2024
f2f9867
fix: lint
mdelapenya Dec 16, 2024
158dc2e
fix: set logFile to nil on terminate
mdelapenya Dec 16, 2024
6 changes: 6 additions & 0 deletions .github/scripts/modules/ollama/install-dependencies.sh
@@ -0,0 +1,6 @@
#!/usr/bin/env bash

curl -fsSL https://ollama.com/install.sh | sh

# kill any running ollama process so that the tests can start from a clean state
sudo systemctl stop ollama.service
10 changes: 10 additions & 0 deletions .github/workflows/ci-test-go.yml
@@ -109,6 +109,16 @@ jobs:
working-directory: ./${{ inputs.project-directory }}
run: go build

- name: Install dependencies
shell: bash
run: |
SCRIPT_PATH="./.github/scripts/${{ inputs.project-directory }}/install-dependencies.sh"
if [ -f "$SCRIPT_PATH" ]; then
bash "$SCRIPT_PATH"
else
echo "No dependencies script found at $SCRIPT_PATH - skipping installation"
fi

- name: go test
# only run tests on linux, there are a number of things that won't allow the tests to run on anything else
# many (maybe, all?) images used can only be build on Linux, they don't have Windows in their manifest, and
45 changes: 45 additions & 0 deletions docs/modules/ollama.md
@@ -16,10 +16,15 @@ go get github.com/testcontainers/testcontainers-go/modules/ollama

## Usage example

The module allows you to run the Ollama container or the local Ollama binary.

<!--codeinclude-->
[Creating an Ollama container](../../modules/ollama/examples_test.go) inside_block:runOllamaContainer
[Running the local Ollama binary](../../modules/ollama/examples_test.go) inside_block:localOllama
<!--/codeinclude-->

If the local Ollama binary fails to execute, the module will fall back to the container version of Ollama.

## Module Reference

### Run function
@@ -48,6 +53,46 @@ When starting the Ollama container, you can pass options in a variadic way to co
If you need to set a different Ollama Docker image, you can set a valid Docker image as the second argument in the `Run` function.
E.g. `Run(context.Background(), "ollama/ollama:0.1.25")`.

#### Use Local

- Not available until the next release of testcontainers-go <a href="https://github.com/testcontainers/testcontainers-go"><span class="tc-version">:material-tag: main</span></a>

!!!warning
Please make sure no local Ollama process is running when using the local version of the module:
Ollama can be started as a system service or as part of the Ollama application, and
interacting with the logs of a running Ollama process not managed by the module is not supported.

If you need to run the local Ollama binary, you can set the `WithUseLocal` option in the `Run` function.
This option accepts a list of environment variables, as `KEY=value` strings, that will be applied to the Ollama binary when executing commands.

E.g. `Run(context.Background(), "ollama/ollama:0.1.25", WithUseLocal("OLLAMA_DEBUG=true"))`.
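
For illustration, here is a minimal, self-contained sketch of the option. It assumes `WithUseLocal` accepts one or more `KEY=value` strings, and that the image tag is kept for the container fallback described above:

```go
package main

import (
	"context"
	"log"

	"github.com/testcontainers/testcontainers-go"
	tcollama "github.com/testcontainers/testcontainers-go/modules/ollama"
)

func main() {
	ctx := context.Background()

	// Run the local ollama binary with debug logging enabled. The image tag
	// is still required: the module falls back to the container if the local
	// binary fails to execute.
	ollamaContainer, err := tcollama.Run(ctx, "ollama/ollama:0.3.13",
		tcollama.WithUseLocal("OLLAMA_DEBUG=true"))
	defer func() {
		if err := testcontainers.TerminateContainer(ollamaContainer); err != nil {
			log.Printf("failed to terminate: %s", err)
		}
	}()
	if err != nil {
		log.Printf("failed to start ollama: %s", err)
		return
	}

	// The connection string points at the local process, typically 127.0.0.1:11434.
	connectionStr, err := ollamaContainer.ConnectionString(ctx)
	if err != nil {
		log.Printf("failed to get connection string: %s", err)
		return
	}
	log.Printf("ollama is reachable at %s", connectionStr)
}
```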

All the container methods are available when using the local Ollama binary, but will be executed locally instead of inside the container.
Please consider the following differences when using the local Ollama binary:

- The local Ollama binary will create a log file in the current working directory, identified by the session ID, e.g. `local-ollama-<session-id>.log`. You can override the log file name with the `OLLAMA_LOGFILE` environment variable, so if you're running Ollama yourself, from the Ollama app or the standalone binary, you can use this environment variable to point the module at the same log file.
- For the Ollama app, the default log file resides in the `$HOME/.ollama/logs/server.log`.
- For the standalone binary, you should start it redirecting the logs to a file. E.g. `ollama serve > /tmp/ollama.log 2>&1`.
- `ConnectionString` returns the connection string to connect to the local Ollama binary started by the module instead of the container, which maps to `127.0.0.1:11434`.
- `ContainerIP` returns `127.0.0.1`.
- `ContainerIPs` returns `["127.0.0.1"]`.
- `CopyToContainer`, `CopyDirToContainer`, `CopyFileToContainer` and `CopyFileFromContainer` don't perform any action.
- `GetLogProductionErrorChannel` returns a nil channel.
- `Endpoint` returns the endpoint to connect to the local Ollama binary started by the module instead of the container, which maps to `127.0.0.1:11434`.
- `Exec` passes the command to the local Ollama binary started by the module instead of executing it inside the container. The first element of the command slice must be the `ollama` command, followed by its arguments; otherwise, an error is returned (see the sketch below).
- `GetContainerID` returns the container ID of the local Ollama binary started by the module instead of the container, which maps to `local-ollama-<session-id>`.
- `Host` returns `127.0.0.1`.
- `Inspect` returns a ContainerJSON with the state of the local Ollama binary started by the module.
- `IsRunning` returns true if the local Ollama binary process started by the module is running.
- `Logs` returns the logs from the local Ollama binary started by the module instead of the container.
- `MappedPort` returns the port mapping for the local Ollama binary started by the module instead of the container.
- `Start` starts the local Ollama binary process.
- `State` returns the current state of the local Ollama binary process, `stopped` or `running`.
- `Stop` stops the local Ollama binary process.
- `Terminate` calls the `Stop` method and then removes the log file.

The local Ollama binary will create a log file in the current working directory, and its contents are available through the container's `Logs` method, as shown in the sketch below.
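
As a rough sketch of how these methods behave with the local binary (assuming a container started with `WithUseLocal` as above, and the standard testcontainers-go signatures for `Exec` and `Logs`):

```go
package ollamaexample

import (
	"context"
	"io"
	"log"

	tcollama "github.com/testcontainers/testcontainers-go/modules/ollama"
)

// dumpLocalOllama is a hypothetical helper: it lists the available models via
// Exec and then prints the module-managed log file via Logs.
func dumpLocalOllama(ctx context.Context, ollamaContainer *tcollama.OllamaContainer) {
	// Exec only accepts ollama commands: the first element of the slice is
	// the ollama command, the rest are its arguments.
	code, out, err := ollamaContainer.Exec(ctx, []string{"ollama", "list"})
	if err != nil || code != 0 {
		log.Printf("exec failed (exit code %d): %v", code, err)
		return
	}
	if b, err := io.ReadAll(out); err == nil {
		log.Printf("ollama list:\n%s", b)
	}

	// Logs streams the local-ollama-<session-id>.log file created by the module.
	logs, err := ollamaContainer.Logs(ctx)
	if err != nil {
		log.Printf("failed to read logs: %s", err)
		return
	}
	defer logs.Close()

	if b, err := io.ReadAll(logs); err == nil {
		log.Printf("ollama log output:\n%s", b)
	}
}
```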

{% include "../features/common_functional_options.md" %}

### Container Methods
70 changes: 70 additions & 0 deletions modules/ollama/examples_test.go
@@ -173,3 +173,73 @@ func ExampleRun_withModel_llama2_langchain() {

// Intentionally not asserting the output, as we don't want to run this example in the tests.
}

func ExampleRun_withLocal() {
ctx := context.Background()

// localOllama {
ollamaContainer, err := tcollama.Run(ctx, "ollama/ollama:0.3.13", tcollama.WithUseLocal("OLLAMA_DEBUG=true"))
defer func() {
if err := testcontainers.TerminateContainer(ollamaContainer); err != nil {
log.Printf("failed to terminate container: %s", err)
}
}()
if err != nil {
log.Printf("failed to start container: %s", err)
return
}
// }

model := "llama3.2:1b"

_, _, err = ollamaContainer.Exec(ctx, []string{"ollama", "pull", model})
if err != nil {
log.Printf("failed to pull model %s: %s", model, err)
return
}

_, _, err = ollamaContainer.Exec(ctx, []string{"ollama", "run", model})
if err != nil {
log.Printf("failed to run model %s: %s", model, err)
return
}

connectionStr, err := ollamaContainer.ConnectionString(ctx)
if err != nil {
log.Printf("failed to get connection string: %s", err)
return
}

var llm *langchainollama.LLM
if llm, err = langchainollama.New(
langchainollama.WithModel(model),
langchainollama.WithServerURL(connectionStr),
); err != nil {
log.Printf("failed to create langchain ollama: %s", err)
return
}

completion, err := llm.Call(
context.Background(),
"how can Testcontainers help with testing?",
llms.WithSeed(42), // fixing the seed makes the completion reproducible
llms.WithTemperature(0.0), // the lower the temperature, the more deterministic the completion
)
if err != nil {
log.Printf("failed to create langchain ollama: %s", err)
return
}

words := []string{
"easy", "isolation", "consistency",
}
lwCompletion := strings.ToLower(completion)

for _, word := range words {
if strings.Contains(lwCompletion, word) {
fmt.Println(true)
}
}

// Intentionally not asserting the output, as we don't want to run this example in the tests.
}
2 changes: 1 addition & 1 deletion modules/ollama/go.mod
@@ -4,6 +4,7 @@ go 1.22

require (
github.com/docker/docker v27.1.1+incompatible
github.com/docker/go-connections v0.5.0
github.com/google/uuid v1.6.0
github.com/stretchr/testify v1.9.0
github.com/testcontainers/testcontainers-go v0.34.0
@@ -22,7 +23,6 @@ require (
github.com/davecgh/go-spew v1.1.1 // indirect
github.com/distribution/reference v0.6.0 // indirect
github.com/dlclark/regexp2 v1.8.1 // indirect
github.com/docker/go-connections v0.5.0 // indirect
github.com/docker/go-units v0.5.0 // indirect
github.com/felixge/httpsnoop v1.0.4 // indirect
github.com/go-logr/logr v1.4.1 // indirect