A thin wrapper around a vector-store-enriched conversation with an LLM chatbot.
See the infrastructure README for details.
- Populate `.env` with local environment variables.
- Run `npm install` in `frontend/` (or use `make setup`), then `npm run dev` to start a development server.
- Run `docker-compose up` in the repository root.
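The infrastructure README lists the actual variables; as a purely illustrative sketch, the `.env` file holds key/value pairs along these lines (the names below are placeholders, not the project's real settings):

```
# .env -- placeholder names only; see the infrastructure README for the real keys
LLM_API_KEY=replace-me
VECTOR_STORE_URL=http://localhost:1234
```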
We use Cypress for end-to-end tests.

- Populate `frontend/.env` with Cognito credentials.
- Start the frontend and backend servers.
- Run `npm run cypress:open` in `frontend/`.
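The Cognito credentials live in `frontend/.env`; the exact variable names depend on how the specs read them, so the ones below are placeholders only:

```
# frontend/.env -- illustrative names, not the project's actual keys
CYPRESS_COGNITO_USERNAME=test-user@example.com
CYPRESS_COGNITO_PASSWORD=replace-me
```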
Backend tests use `pytest`.

- Create a local virtualenv from `backend/requirements.txt` and `backend/requirements-dev.txt`.
- Run `pytest` (or `ptw backend/` for continuous `pytest` runs).
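As a sanity check that the environment is wired up, pytest will collect any `test_*.py` module on its search path; a minimal (hypothetical) example:

```python
# backend/tests/test_smoke.py  (hypothetical path and name)
def test_truthiness():
    # pytest collects functions named test_* and treats bare asserts as checks
    assert 1 + 1 == 2
```

`ptw` comes from pytest-watch, which reruns the suite whenever a file under the watched directory changes.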