Replace llava with llava-phi3
ThomasVitale committed Jul 10, 2024
1 parent 7c8fe26 commit 75a919d
Showing 3 changed files with 6 additions and 6 deletions.
8 changes: 4 additions & 4 deletions 02-prompts/prompts-multimodality-ollama/README.md
@@ -9,10 +9,10 @@ The application relies on Ollama for providing LLMs. You can either run Ollama l
 ### Ollama as a native application
 
 First, make sure you have [Ollama](https://ollama.ai) installed on your laptop.
-Then, use Ollama to run the _llava_ large language model. That's what we'll use in this example.
+Then, use Ollama to run the _llava-phi3_ large language model. That's what we'll use in this example.
 
 ```shell
-ollama run llava
+ollama run llava-phi3
 ```
 
 Finally, run the Spring Boot application.
@@ -23,15 +23,15 @@ Finally, run the Spring Boot application.
 
 ### Ollama as a dev service with Testcontainers
 
-The application relies on the native Testcontainers support in Spring Boot to spin up an Ollama service with a _llava_ model at startup time.
+The application relies on the native Testcontainers support in Spring Boot to spin up an Ollama service with a _llava-phi3_ model at startup time.
 
 ```shell
 ./gradlew bootTestRun
 ```
 
 ## Calling the application
 
-You can now call the application that will use Ollama and _llava_ to generate text based on a default image.
+You can now call the application that will use Ollama and _llava-phi3_ to generate text based on a default image.
 This example uses [httpie](https://httpie.io) to send HTTP requests.
 
 ```shell
@@ -3,5 +3,5 @@ spring:
     ollama:
       chat:
         options:
-          model: llava
+          model: llava-phi3
           temperature: 0.7
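
The hunk above edits a Spring AI configuration file in which the `model` option selects which Ollama model the chat client uses. A minimal sketch of how the full property path likely looks, assuming Spring AI's standard `spring.ai.ollama` property prefix (the top of the file is truncated in this excerpt, so the `spring:` and `ai:` levels are an assumption):

```yaml
spring:
  ai:
    ollama:
      chat:
        options:
          model: llava-phi3   # Ollama model used for chat completions
          temperature: 0.7    # sampling temperature
```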
@@ -15,7 +15,7 @@ public class TestPromptsMultimodalityOllamaApplication {
 	@RestartScope
 	@ServiceConnection
 	OllamaContainer ollama() {
-		return new OllamaContainer(DockerImageName.parse("ghcr.io/thomasvitale/ollama-llava")
+		return new OllamaContainer(DockerImageName.parse("ghcr.io/thomasvitale/ollama-llava-phi3")
 			.asCompatibleSubstituteFor("ollama/ollama"));
 	}

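The `OllamaContainer` bean above works through Spring Boot's Testcontainers service-connection support (`@ServiceConnection`) together with devtools (`@RestartScope`). A sketch of the Gradle dependencies such a setup typically needs; the artifact coordinates are assumptions based on Spring AI and Testcontainers naming conventions around the time of this commit, not taken from this diff:

```gradle
dependencies {
    // Spring AI integration with Ollama chat models
    implementation 'org.springframework.ai:spring-ai-ollama-spring-boot-starter'

    // Testcontainers support with @ServiceConnection auto-wiring
    testImplementation 'org.springframework.boot:spring-boot-testcontainers'
    // Provides @RestartScope so the container survives devtools restarts
    testImplementation 'org.springframework.boot:spring-boot-devtools'
    // Testcontainers module providing OllamaContainer
    testImplementation 'org.testcontainers:ollama'
}
```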