A comprehensive solution for LLM integration and API key generation, deployed on the Akash Network.
This repository contains the complete codebase and deployment instructions for the AkashChat API service. The project consists of two main components:
- A frontend application for API key generation
- A backend service for LLM (Large Language Model) load balancing

## Frontend Application

- Purpose: User interface for API key generation
- Live Version: [chatapi.akash.network](https://chatapi.akash.network)
- Location: `/frontend` directory
- Requirements:
  - MongoDB instance running on port 27017 for user data storage (a local-development sketch follows this list)
  - A running instance of the LiteLLM backend
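
For local development, a containerized MongoDB instance is one convenient way to satisfy the first requirement. The snippet below is only a minimal Docker Compose sketch — the service and volume names are arbitrary and this file is not part of the repository — that provides a MongoDB server on the port the frontend expects.

```yaml
# Hypothetical docker-compose.yml for local development only; not part of this repo.
services:
  mongodb:
    image: mongo:7
    ports:
      - "27017:27017"        # expose MongoDB on the default port the frontend expects
    volumes:
      - mongo-data:/data/db  # persist user data across container restarts

volumes:
  mongo-data:
```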

## Backend Service

- Technology: LiteLLM
- Features:
  - Intelligent load balancing across multiple LLM providers
  - Unified API interface for various LLM services
  - Redis-based prompt caching
- Requirements:
  - A working LiteLLM configuration file; an example can be found in the `deployment/config.yaml` file (a hedged sketch follows this list)
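
To illustrate the shape of such a file, here is a hedged sketch of a LiteLLM configuration that registers two placeholder providers under the same model alias (which is what lets the router balance load between them) and enables Redis-backed prompt caching. All model names, endpoints, and environment-variable names below are assumptions for illustration; `deployment/config.yaml` in this repository is the authoritative example.

```yaml
# Illustrative sketch only -- model names, endpoints, and key variables are placeholders.
model_list:
  - model_name: llama-3-70b                        # alias exposed through the unified API
    litellm_params:
      model: openai/meta-llama-3-70b-instruct      # OpenAI-compatible upstream
      api_base: https://provider-a.example.com/v1  # placeholder endpoint
      api_key: os.environ/PROVIDER_A_API_KEY
  - model_name: llama-3-70b                        # same alias, second provider -> load balanced
    litellm_params:
      model: openai/meta-llama-3-70b-instruct
      api_base: https://provider-b.example.com/v1  # placeholder endpoint
      api_key: os.environ/PROVIDER_B_API_KEY

router_settings:
  routing_strategy: least-busy                     # spread requests across the entries above

litellm_settings:
  cache: true                                      # enable prompt caching
  cache_params:
    type: redis                                    # connection details are typically read from
                                                   # REDIS_HOST / REDIS_PORT / REDIS_PASSWORD
```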

## Deployment

- Deployment configuration is available in the `deployment` directory
- Use `deploy.yml` for deploying the service to the Akash Network (a minimal SDL sketch follows this list)
- Additional configuration options can be found in `deployment/config.yaml`
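
For orientation, an Akash SDL file generally has the shape shown below. This is only a rough sketch — the image, ports, resources, and pricing are placeholders — and it does not reproduce the contents of this repository's `deploy.yml`.

```yaml
# Rough SDL sketch -- image, ports, resources, and pricing below are placeholders.
---
version: "2.0"

services:
  litellm:
    image: ghcr.io/berriai/litellm:main-latest   # placeholder image reference
    expose:
      - port: 4000                               # LiteLLM proxy listens on 4000 by default
        as: 80                                   # expose it publicly on port 80
        to:
          - global: true

profiles:
  compute:
    litellm:
      resources:
        cpu:
          units: 2
        memory:
          size: 2Gi
        storage:
          size: 1Gi
  placement:
    dcloud:
      pricing:
        litellm:
          denom: uakt
          amount: 1000                           # maximum bid price, placeholder value

deployment:
  litellm:
    dcloud:
      profile: litellm
      count: 1
```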

## Setup

Please refer to the respective directories for detailed setup instructions:

- Frontend setup: see `/frontend/README.md`
- Deployment guide: check the files in `/deployment`