feat(ollama): support calling the Ollama local process (#2923)
* feat: support running ollama from the local binary

* fix: wrong working dir at CI

* chore: extract wait to a function

* chore: print local binary logs on error

* chore: remove debug logs

* fix(ci): kill ollama before the tests

* chore: stop ollama using systemctl

* chore: support setting log file from the env

* chore: support running ollama commands, only

* fix: release lock on error

* chore: add more test coverage for the option

* chore: simplify useLocal checks

* chore: simplify

* chore: pass context to runLocal

* chore: move ctx to the right scope

* chore: remove not needed

* chore: use a container function

* chore: support reading OLLAMA_HOST

* chore: return error with copy APIs

* chore: simply execute the script

* chore: simplify var initialisation

* chore: return nil

* fix: return errors on terminate

* chore: remove options type

* chore: use a map

* chore: simplify error on wait

* chore: wrap start logic around the localContext

* chore: fold

* chore: merge wait into start

* fix: use proper ContainersState

* fix: remove extra conversion

* chore: handle remove log file errors properly

* chore: go back to string in env vars

* refactor(ollama): local process

Refactor local process handling for Ollama using a container implementation
avoiding the wrapping methods.

This defaults to running the binary with an ephemeral port to avoid port
conflicts. This behaviour can be overridden by setting OLLAMA_HOST either
in the parent environment or in the values passed via WithUseLocal.

Improve API compatibility with:

- Multiplexed output streams
- State reporting
- Exec option processing
- WaitingFor customisation

Fix Container implementation:

- Port management
- Running checks
- Terminate processing
- Endpoint argument definition
- Add missing methods
- Consistent environment handling

* chore(ollama): refactor local to use log sub match.

Refactor local processing to use the new log sub match functionality.

* feat(ollama): validate container request

Validate the container request to ensure the user configuration can be processed
and no fields that would be ignored are present.

* chore(ollama): remove temporary test

Remove temporary simple test.

* feat(ollama): configurable local process binary

Allow the local ollama binary name to be configured using the image name.

* docs(ollama): detail local process supported fields

Detail the container request supported fields.

* docs(ollama): update local process site docs

Update local process site docs to match recent changes.

* chore: refactor to support TerminateOption

Refactor Terminate to support testcontainers.TerminateOption.

* fix: remove unused var

---------

Co-authored-by: Manuel de la Peña <mdelapenya@gmail.com>
stevenh and mdelapenya authored Jan 2, 2025
1 parent 632249a commit 6ec91f1
Showing 10 changed files with 1,630 additions and 16 deletions.
6 changes: 6 additions & 0 deletions .github/scripts/modules/ollama/install-dependencies.sh
@@ -0,0 +1,6 @@
#!/usr/bin/env bash

curl -fsSL https://ollama.com/install.sh | sh

# kill any running ollama process so that the tests can start from a clean state
sudo systemctl stop ollama.service
10 changes: 10 additions & 0 deletions .github/workflows/ci-test-go.yml
@@ -107,6 +107,16 @@ jobs:
        working-directory: ./${{ inputs.project-directory }}
        run: go build

      - name: Install dependencies
        shell: bash
        run: |
          SCRIPT_PATH="./.github/scripts/${{ inputs.project-directory }}/install-dependencies.sh"
          if [ -f "$SCRIPT_PATH" ]; then
            $SCRIPT_PATH
          else
            echo "No dependencies script found at $SCRIPT_PATH - skipping installation"
          fi

      - name: go test
        # only run tests on linux, there are a number of things that won't allow the tests to run on anything else
        # many (maybe, all?) images used can only be built on Linux, they don't have Windows in their manifest, and
50 changes: 50 additions & 0 deletions docs/modules/ollama.md
@@ -16,10 +16,15 @@ go get github.com/testcontainers/testcontainers-go/modules/ollama

## Usage example

The module allows you to run the Ollama container or the local Ollama binary.

<!--codeinclude-->
[Creating an Ollama container](../../modules/ollama/examples_test.go) inside_block:runOllamaContainer
[Running the local Ollama binary](../../modules/ollama/examples_test.go) inside_block:localOllama
<!--/codeinclude-->

If the local Ollama binary fails to execute, the module will fall back to the container version of Ollama.

## Module Reference

### Run function
@@ -48,6 +53,51 @@ When starting the Ollama container, you can pass options in a variadic way to co
If you need to set a different Ollama Docker image, you can set a valid Docker image as the second argument in the `Run` function.
E.g. `Run(context.Background(), "ollama/ollama:0.1.25")`.

#### Use Local

- Not available until the next release of testcontainers-go <a href="https://github.com/testcontainers/testcontainers-go"><span class="tc-version">:material-tag: main</span></a>

!!!warning
    Please make sure the local Ollama binary is not running when using the local version of the module:
    Ollama can be started as a system service, or as part of the Ollama application,
    and interacting with the logs of a running Ollama process not managed by the module is not supported.

If you need to run the local Ollama binary, you can set the `UseLocal` option in the `Run` function.
This option accepts a list of environment variables, as strings, which will be applied to the Ollama binary when executing commands.

E.g. `Run(context.Background(), "ollama/ollama:0.1.25", WithUseLocal("OLLAMA_DEBUG=true"))`.
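
A minimal sketch of a test using this option, assuming `WithUseLocal` accepts multiple `KEY=VALUE` strings (the log file name `ollama.log` is illustrative):

```go
package ollama_test

import (
	"context"
	"testing"

	"github.com/testcontainers/testcontainers-go"
	tcollama "github.com/testcontainers/testcontainers-go/modules/ollama"
)

func TestRunLocalOllama(t *testing.T) {
	ctx := context.Background()

	// Each value is a KEY=VALUE pair applied to the local ollama process.
	ollamaContainer, err := tcollama.Run(ctx, "ollama/ollama:0.3.13",
		tcollama.WithUseLocal("OLLAMA_DEBUG=true", "OLLAMA_LOGFILE=ollama.log"))
	testcontainers.CleanupContainer(t, ollamaContainer)
	if err != nil {
		t.Fatalf("failed to start ollama: %s", err)
	}
}
```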

All the container methods are available when using the local Ollama binary, but they are executed locally instead of inside the container.
Please consider the following differences when using the local Ollama binary:

- The local Ollama binary will create a log file in the current working directory, identified by the session ID, e.g. `local-ollama-<session-id>.log`. It's possible to set the log file name using the `OLLAMA_LOGFILE` environment variable, so if you're running Ollama yourself, from the Ollama app or the standalone binary, you can use this environment variable to point the module at the same log file.
    - For the Ollama app, the default log file resides at `$HOME/.ollama/logs/server.log`.
    - For the standalone binary, you should start it with its logs redirected to a file, e.g. `ollama serve > /tmp/ollama.log 2>&1`.
- `ConnectionString` returns the connection string to connect to the local Ollama binary started by the module instead of the container.
- `ContainerIP` returns the bound host IP `127.0.0.1` by default.
- `ContainerIPs` returns the bound host IP `["127.0.0.1"]` by default.
- `CopyToContainer`, `CopyDirToContainer`, `CopyFileToContainer` and `CopyFileFromContainer` return an error if called.
- `GetLogProductionErrorChannel` returns a nil channel.
- `Endpoint` returns the endpoint to connect to the local Ollama binary started by the module instead of the container.
- `Exec` executes the command with the local Ollama binary started by the module instead of inside the container. The first element of the command must be the command to execute, followed by its arguments; any other shape returns an error (see the sketch after this list).
- `GetContainerID` returns the container ID of the local Ollama binary started by the module instead of the container, which maps to `local-ollama-<session-id>`.
- `Host` returns the bound host IP `127.0.0.1` by default.
- `Inspect` returns a ContainerJSON with the state of the local Ollama binary started by the module.
- `IsRunning` returns true if the local Ollama binary process started by the module is running.
- `Logs` returns the logs from the local Ollama binary started by the module instead of the container.
- `MappedPort` returns the port mapping for the local Ollama binary started by the module instead of the container.
- `Start` starts the local Ollama binary process.
- `State` returns the current state of the local Ollama binary process, `stopped` or `running`.
- `Stop` stops the local Ollama binary process.
- `Terminate` calls the `Stop` method and then removes the log file.
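
As a rough sketch of that `Exec` contract, assuming the standard testcontainers `Exec` signature and the `Multiplexed()` exec option to demultiplex the combined output stream (`listModels` is a hypothetical helper):

```go
package ollama_test

import (
	"context"
	"fmt"
	"io"

	tcexec "github.com/testcontainers/testcontainers-go/exec"
	tcollama "github.com/testcontainers/testcontainers-go/modules/ollama"
)

// listModels prints the output of "ollama list", whether it runs locally or in a container.
func listModels(ctx context.Context, ollamaContainer *tcollama.OllamaContainer) error {
	// The first element is the command to execute; the rest are its arguments.
	code, reader, err := ollamaContainer.Exec(ctx, []string{"ollama", "list"}, tcexec.Multiplexed())
	if err != nil {
		return fmt.Errorf("exec: %w", err)
	}
	if code != 0 {
		return fmt.Errorf("unexpected exit code: %d", code)
	}

	out, err := io.ReadAll(reader)
	if err != nil {
		return fmt.Errorf("read output: %w", err)
	}

	fmt.Printf("%s", out)
	return nil
}
```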

The local Ollama binary will create a log file in the current working directory, and its contents are available through the container's `Logs` method.

!!!info
    The local Ollama binary uses the `OLLAMA_HOST` environment variable to set the host and port to listen on.
    If the environment variable is not set, it defaults to `localhost:0`,
    which binds to a loopback address on an ephemeral port to avoid port conflicts.
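
For example, a sketch pinning the listen address via `WithUseLocal`, reusing the setup from the sketches above (`11434` is Ollama's conventional default port, used here purely for illustration):

```go
// Pin the local process to a fixed loopback address instead of an ephemeral port.
ollamaContainer, err := tcollama.Run(ctx, "ollama/ollama:0.3.13",
	tcollama.WithUseLocal("OLLAMA_HOST=127.0.0.1:11434"))
```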

{% include "../features/common_functional_options.md" %}

### Container Methods
70 changes: 70 additions & 0 deletions modules/ollama/examples_test.go
@@ -173,3 +173,73 @@ func ExampleRun_withModel_llama2_langchain() {

	// Intentionally not asserting the output, as we don't want to run this example in the tests.
}

func ExampleRun_withLocal() {
	ctx := context.Background()

	// localOllama {
	ollamaContainer, err := tcollama.Run(ctx, "ollama/ollama:0.3.13", tcollama.WithUseLocal("OLLAMA_DEBUG=true"))
	defer func() {
		if err := testcontainers.TerminateContainer(ollamaContainer); err != nil {
			log.Printf("failed to terminate container: %s", err)
		}
	}()
	if err != nil {
		log.Printf("failed to start container: %s", err)
		return
	}
	// }

	model := "llama3.2:1b"

	_, _, err = ollamaContainer.Exec(ctx, []string{"ollama", "pull", model})
	if err != nil {
		log.Printf("failed to pull model %s: %s", model, err)
		return
	}

	_, _, err = ollamaContainer.Exec(ctx, []string{"ollama", "run", model})
	if err != nil {
		log.Printf("failed to run model %s: %s", model, err)
		return
	}

	connectionStr, err := ollamaContainer.ConnectionString(ctx)
	if err != nil {
		log.Printf("failed to get connection string: %s", err)
		return
	}

	var llm *langchainollama.LLM
	if llm, err = langchainollama.New(
		langchainollama.WithModel(model),
		langchainollama.WithServerURL(connectionStr),
	); err != nil {
		log.Printf("failed to create langchain ollama: %s", err)
		return
	}

	completion, err := llm.Call(
		context.Background(),
		"how can Testcontainers help with testing?",
		llms.WithSeed(42),         // fix the seed so the completion is reproducible
		llms.WithTemperature(0.0), // the lower the temperature, the more deterministic the completion
	)
	if err != nil {
		log.Printf("failed to generate completion: %s", err)
		return
	}

	words := []string{
		"easy", "isolation", "consistency",
	}
	lwCompletion := strings.ToLower(completion)

	for _, word := range words {
		if strings.Contains(lwCompletion, word) {
			fmt.Println(true)
		}
	}

	// Intentionally not asserting the output, as we don't want to run this example in the tests.
}
2 changes: 1 addition & 1 deletion modules/ollama/go.mod
@@ -4,6 +4,7 @@ go 1.22

require (
	github.com/docker/docker v27.1.1+incompatible
	github.com/docker/go-connections v0.5.0
	github.com/google/uuid v1.6.0
	github.com/stretchr/testify v1.9.0
	github.com/testcontainers/testcontainers-go v0.34.0
@@ -22,7 +23,6 @@ require (
	github.com/davecgh/go-spew v1.1.1 // indirect
	github.com/distribution/reference v0.6.0 // indirect
	github.com/dlclark/regexp2 v1.8.1 // indirect
	github.com/docker/go-connections v0.5.0 // indirect
	github.com/docker/go-units v0.5.0 // indirect
	github.com/felixge/httpsnoop v1.0.4 // indirect
	github.com/go-logr/logr v1.4.1 // indirect