
[Enhancement]: Ollama module #1097

Open
frankhaugen opened this issue Jan 24, 2024 · 5 comments · May be fixed by #1099
Labels
enhancement New feature or request module An official Testcontainers module

Comments

@frankhaugen

Problem

There is no Ollama module.

I realize that my POC here needs a discussion of design and features, but it's a fun little thing.

Solution

Make a module. Here's a prototype of Ollama working with Testcontainers in LINQPad 8:

Depends on the NuGet packages Testcontainers and OllamaSharp.

// Build the Ollama container: publish the API port and mount a volume so downloaded models persist across runs.
var container = new ContainerBuilder()
	.WithName("ollama-container")
	.WithImage("ollama/ollama")
	.WithPortBinding(11434, 11434)
	.WithVolumeMount("ollama", "/root/.ollama")
	.WithExposedPort(11434)
	.Build();
	
// Keeps the script alive until the prompt below has been answered.
var keepAlive = true;

// Once the container reports started, launch the model inside it and send a test prompt.
container.Started += async (sender, args) => 
{
	Console.WriteLine("Container started");
	try
	{
		Console.WriteLine("Start ollama service");
		// Pull and start the llama2 model inside the container.
		await container.ExecAsync(new List<string>() {
			"ollama run llama2"
		});
		var result = await RunAsync();
		result.Dump();
	}
	catch (Exception ex)
	{
		Console.WriteLine(ex);
	}
	
	keepAlive = false;
};

await container.StartAsync();

// Poll until the prompt has been answered or the LINQPad query is cancelled.
while (!QueryCancelToken.IsCancellationRequested && keepAlive)
{
	await Task.Delay(1000);
}

var logs = await container.GetLogsAsync();

await container.StopAsync();
await container.DisposeAsync();

// Sends a single chat request to the Ollama API via OllamaSharp and returns the concatenated response.
async Task<string> RunAsync()
{
	try
	{
		var client = new OllamaApiClient("http://localhost:11434/api/generate", "llama2");
		ChatRequest request = new ChatRequest();
		request.Messages ??= new List<Message>();
		request.Messages.Add(new Message(ChatRole.User, "Write me a dad-joke about cats please?", Enumerable.Empty<string>().ToArray()));
		request.Model = "llama2";
		request.Stream = false;
		var messages = new List<ChatResponseStream>();
		var responses = await client.SendChat(request, x => messages.Add(x));
		return string.Join("\n\n", responses.Select(r => r.Content));
	}
	catch (Exception ex)
	{
		return ex.ToString();
	}
}

Benefit

It's really cool.

Alternatives

To be less cool

Would you like to help contributing this enhancement?

Yes

@frankhaugen frankhaugen added the enhancement New feature or request label Jan 24, 2024
@frankhaugen
Author

The output from a non-initial run:

[testcontainers.org 00:00:00.07] Connected to Docker:
Host: npipe://./pipe/docker_engine
Server Version: 24.0.7
Kernel Version: 5.10.102.1-microsoft-standard-WSL2
API Version: 1.43
Operating System: Docker Desktop
Total Memory: 62.39 GB
[testcontainers.org 00:00:00.30] Docker container 042f3c377795 created
[testcontainers.org 00:00:00.35] Start Docker container 042f3c377795
[testcontainers.org 00:00:01.67] Wait for Docker container 042f3c377795 to complete readiness checks
[testcontainers.org 00:00:01.69] Docker container 042f3c377795 ready
[testcontainers.org 00:00:01.73] Docker container a9d04460af00 created
[testcontainers.org 00:00:01.76] Start Docker container a9d04460af00
[testcontainers.org 00:00:02.01] Wait for Docker container a9d04460af00 to complete readiness checks
[testcontainers.org 00:00:02.02] Docker container a9d04460af00 ready
Container started
Start ollama service
[testcontainers.org 00:00:02.03] Execute "ollama run llama2" at Docker container a9d04460af00
Write me a dad-joke about cats please?
Why did the cat join a band? Because he wanted to be the purr-cussionist! 😹
[testcontainers.org 00:00:08.85] Stop Docker container a9d04460af00
[testcontainers.org 00:00:09.24] Delete Docker container a9d04460af00

@HofmeisterAn
Collaborator

Hi 👋, thanks for creating the request.

Total Memory: 62.39 GB

This sounds like you are running a pretty nice machine 😄.

Since you are interested in contributing the Ollama module, which is great, I would like to share some additional information. You will find a basic guide about implementing a module here. All of our existing modules can be used as a blueprint too. All of them follow our best practices, which are necessary for a smoothly working module across various systems, etc.

If you have any further questions, do not hesitate to reach out to me or the other developers in our Testcontainers community on Slack.
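For orientation, here is a rough structural sketch of what such a module could look like, modeled on the repository's existing modules. The class names OllamaConfiguration, OllamaContainer, and OllamaBuilder, the wait strategy, and the GetBaseAddress helper are illustrative assumptions, and the exact base-class members differ between Testcontainers for .NET versions, so treat this as an outline rather than the final API:

using System;
using Docker.DotNet.Models;
using DotNet.Testcontainers.Builders;
using DotNet.Testcontainers.Configurations;
using DotNet.Testcontainers.Containers;
using Microsoft.Extensions.Logging;

// Module-specific settings; the copy constructors are required by the builder's Clone/Merge plumbing.
public sealed class OllamaConfiguration : ContainerConfiguration
{
	public OllamaConfiguration() { }
	public OllamaConfiguration(IResourceConfiguration<CreateContainerParameters> resourceConfiguration) : base(resourceConfiguration) { }
	public OllamaConfiguration(IContainerConfiguration resourceConfiguration) : base(resourceConfiguration) { }
	public OllamaConfiguration(OllamaConfiguration resourceConfiguration) : this(new OllamaConfiguration(), resourceConfiguration) { }
	public OllamaConfiguration(OllamaConfiguration oldValue, OllamaConfiguration newValue) : base(oldValue, newValue) { }
}

// Thin wrapper over DockerContainer that exposes Ollama-specific convenience members.
public sealed class OllamaContainer : DockerContainer
{
	public OllamaContainer(OllamaConfiguration configuration, ILogger logger) : base(configuration, logger) { }

	// Hypothetical helper: base address of the Ollama HTTP API on the mapped host port.
	public Uri GetBaseAddress() => new UriBuilder(Uri.UriSchemeHttp, Hostname, GetMappedPublicPort(OllamaBuilder.OllamaPort)).Uri;
}

// Builder with sensible defaults: image, randomly assigned host port, and a readiness check.
public sealed class OllamaBuilder : ContainerBuilder<OllamaBuilder, OllamaContainer, OllamaConfiguration>
{
	public const string OllamaImage = "ollama/ollama";
	public const ushort OllamaPort = 11434;

	public OllamaBuilder() : this(new OllamaConfiguration())
	{
		DockerResourceConfiguration = Init().DockerResourceConfiguration;
	}

	private OllamaBuilder(OllamaConfiguration resourceConfiguration) : base(resourceConfiguration)
	{
		DockerResourceConfiguration = resourceConfiguration;
	}

	protected override OllamaConfiguration DockerResourceConfiguration { get; }

	public override OllamaContainer Build()
	{
		Validate();
		return new OllamaContainer(DockerResourceConfiguration, TestcontainersSettings.Logger);
	}

	protected override OllamaBuilder Init()
	{
		return base.Init()
			.WithImage(OllamaImage)
			.WithPortBinding(OllamaPort, true)
			.WithWaitStrategy(Wait.ForUnixContainer().UntilPortIsAvailable(OllamaPort));
	}

	protected override OllamaBuilder Clone(IResourceConfiguration<CreateContainerParameters> resourceConfiguration)
	{
		return Merge(DockerResourceConfiguration, new OllamaConfiguration(resourceConfiguration));
	}

	protected override OllamaBuilder Clone(IContainerConfiguration resourceConfiguration)
	{
		return Merge(DockerResourceConfiguration, new OllamaConfiguration(resourceConfiguration));
	}

	protected override OllamaBuilder Merge(OllamaConfiguration oldValue, OllamaConfiguration newValue)
	{
		return new OllamaBuilder(new OllamaConfiguration(oldValue, newValue));
	}
}

The constructor and Clone/Merge boilerplate mirrors the pattern the existing modules use so that WithImage, WithPortBinding, and the other fluent calls keep returning an OllamaBuilder.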

@frankhaugen
Author

If you have any further questions, do not hesitate to reach out to me or the other developers in our Testcontainers community on Slack.

So I spent the day (in the few moments between my day job's demands for attention) and got it pretty nice. Then, when I went through the best practices, I hit a stumbling block. I might not need the volume to be statically named, but a default, configurable name would save time, since there are resources downloaded into it; a sketch of what I mean is below.

From the best practices:

Avoid assigning static names to Docker resources such as containers, networks, and volumes. Static names may clash with existing resources, causing tests to fail. Instead, use random assigned names. By default, Docker assigns random names.

I will make the PR and link this issue, and we can discuss it when actually looking at the code.
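For illustration only (not the PR's actual implementation), a default-but-overridable volume name could be generated per run so it never clashes, while still letting callers pin a name when they want to reuse the downloaded model weights; ResolveVolumeName and its volumeName parameter are hypothetical:

// Hypothetical sketch, using the same namespaces as the prototype above.
// Default to a random volume name (per the best practices), but allow a fixed
// name for callers who want to keep the downloaded models between test runs.
string ResolveVolumeName(string volumeName = null)
{
	return volumeName ?? "ollama-" + Guid.NewGuid().ToString("D");
}

var ollamaContainer = new ContainerBuilder()
	.WithImage("ollama/ollama")
	.WithPortBinding(11434, true)
	.WithVolumeMount(ResolveVolumeName(), "/root/.ollama")
	.Build();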

@frankhaugen frankhaugen linked a pull request Jan 25, 2024 that will close this issue
frankhaugen added a commit to frankhaugen/testcontainers-dotnet that referenced this issue Feb 1, 2024
This commit updates the Ollama configuration to allow more customization options and removes unnecessary test helpers. The OllamaConfiguration class was refactored to provide more configurable parameters such as the VolumePath and VolumeName. Additionally, the TestOutputHelperExtensions and TestOutputLogger classes were deleted as they were not providing any significant value.
@HofmeisterAn HofmeisterAn added the module An official Testcontainers module label Mar 11, 2024
@duranserkan

Is there any update on this?

@HofmeisterAn
Collaborator

We had some technical limitations adding more modules to the repository. We have partially addressed this issue (#1136). Right now, I have a couple of days off, so I might find some time to do reviews, but I am not prioritizing it and plan to spend the time with family. Module PRs are not critical in my opinion; developers can always fall back to the generic container builder and use the configurations from Init() (in the PR). I am aware that there are a couple of modules waiting for review, and I will try to address them after my time off. I try to address urgent bugs or blockers as soon as possible.
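As a concrete example of that fallback, something along these lines should already work with the generic builder alone (a sketch assuming the ollama/ollama image and its default port 11434; the wait strategy and model name are choices, not part of any module):

// Generic-builder fallback: no module required. Publish the Ollama port on a random host port
// and wait until it is reachable before pulling a model.
var ollama = new ContainerBuilder()
	.WithImage("ollama/ollama")
	.WithPortBinding(11434, true)
	.WithWaitStrategy(Wait.ForUnixContainer().UntilPortIsAvailable(11434))
	.Build();

await ollama.StartAsync();

// Pull the model inside the container before issuing requests.
await ollama.ExecAsync(new[] { "ollama", "pull", "llama2" });

// The HTTP API is reachable on the randomly mapped host port.
var baseAddress = new UriBuilder("http", ollama.Hostname, ollama.GetMappedPublicPort(11434)).Uri;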
