LLAMARA Docker Deployment

LLAMARA - Large Language Assistant for Model-Augmented Retrieval and Analysis - is an LLM-based assistant for information retrieval from a provided knowledge base. It aims to support researchers working with scientific papers, whitepapers, and documentation, and may also serve research findings to the public in an accessible way.

This repository contains a Docker Compose file and a configuration template to deploy the LLAMARA distribution Docker container.

Dependencies

Authentication

This application requires an OIDC authentication provider to be set up. We recommend using Keycloak as the OIDC provider.

You need to add the microprofile-jwt and profile client scopes to the Quarkus client; for Keycloak, see the Keycloak Server Documentation.
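
To check that your OIDC provider is reachable and correctly configured before wiring it into LLAMARA, you can query its OIDC discovery document. The host and realm below are placeholders for a Keycloak setup; substitute your own values:

curl https://keycloak.example.com/realms/llamara/.well-known/openid-configuration

The response lists the issuer, the supported scopes, and the endpoint URLs that the Quarkus client will use.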

Configuration

Provide the required environment variables in a .env file:

QUARKUS_OIDC_CREDENTIALS_SECRET=
POSTGRES_PASSWORD=
MINIO_ROOT_PASSWORD=
MINIO_ACCESS_KEY=
MINIO_SECRET_KEY=
REDIS_PASSWORD=
QDRANT_API_KEY=

OPENAI_API_KEY=
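
QUARKUS_OIDC_CREDENTIALS_SECRET must match the client secret configured in your OIDC provider, and OPENAI_API_KEY is issued by OpenAI. The remaining secrets can be chosen freely; one way to generate strong random values is with OpenSSL, for example:

openssl rand -base64 32   # generate one value per variable and paste it into .env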

Use application.yaml.sample to create an application.yaml file in the config directory that provides the required configuration.

Refer to llamara-backend to learn about the supported configuration properties.
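
As an illustration only - the authoritative reference is application.yaml.sample and the llamara-backend documentation - the OIDC provider from the Authentication section is typically wired up via the standard Quarkus OIDC properties. The URL and client ID below are placeholders:

quarkus:
  oidc:
    auth-server-url: https://keycloak.example.com/realms/llamara
    client-id: llamara-backend

The client secret itself does not belong in application.yaml; it is supplied via the QUARKUS_OIDC_CREDENTIALS_SECRET variable in the .env file.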

Deployment

Once you have completed the configuration step, start LLAMARA through Docker Compose:

docker compose up
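
For day-to-day operation you will typically run the stack in the background and inspect the logs separately:

docker compose up -d      # start all services in the background
docker compose logs -f    # follow the logs of all services
docker compose down       # stop and remove the containers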