A helpful chatbot trained on Open Bank Project API documentation.

Run with Docker:

- Install Docker Compose if you haven't already
- Copy `.env-example` to `.env`
  - Copy over your OpenAI API key
Make sure you have a `./certs/public_key.pem` file. If you don't, create the directory and copy the key over from API Explorer II.
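The step above can be sketched as follows (the location of the key inside your API Explorer II checkout is an assumption; adjust the path to match your setup):

```shell
# Create the certs directory if it doesn't exist yet.
mkdir -p ./certs
# Assumed path to the key in a sibling API Explorer II checkout.
SRC=../API-Explorer-II/certs/public_key.pem
if [ -f "$SRC" ]; then cp "$SRC" ./certs/public_key.pem; fi
```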
Change the `.env` file to have:

    ENDPOINT_METADATA_PATH=./vector-database/endpoint_metadata.json
    GLOSSAY_METADATA_PATH=./vector-database/glossary_metadata.json
    ENDPOINT_VECTOR_DATABASE_PATH=./vector-database/endpoint_index.faiss
    GLOSSARY_VECTOR_DATABASE_PATH=./vector-database/glossary_index.faiss
Then build and run the containers:

    sudo docker compose build
    sudo docker compose up
Note: if you are running this on your local Docker engine and already have an instance of Redis running, you may need to change `REDIS_PORT` in the `.env` file to avoid a clash.
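A quick way to check whether the default Redis port is already taken before picking a new `REDIS_PORT` (6379 is Redis's default; adjust if your `.env` differs) — a small stdlib sketch:

```python
import socket

def port_in_use(port, host="127.0.0.1"):
    # Try to connect; success means something (e.g. another Redis instance)
    # is already listening on that port.
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(0.5)
        return s.connect_ex((host, port)) == 0

# If this prints True, choose a different REDIS_PORT in .env.
print(port_in_use(6379))
```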
The chat endpoint should now be running at http://127.0.0.1:5000/chat

The best way to interact with Opey locally is to run API Explorer II locally and use its chat widget. Otherwise, you can chat with the bot using curl (or whatever HTTP client you like):
    curl -XPOST -H "Content-type: application/json" -d '{
      "session_id": "123456789",
      "obp_api_host": "https://test.openbankproject.com",
      "message": "Which endpoint would I use to create a new customer at a bank?"
    }' 'http://127.0.0.1:5000/chat'
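The same request can be made from Python with only the standard library; the payload fields mirror the curl example above:

```python
import json
from urllib import request, error

# Same payload as the curl example.
payload = {
    "session_id": "123456789",
    "obp_api_host": "https://test.openbankproject.com",
    "message": "Which endpoint would I use to create a new customer at a bank?",
}

req = request.Request(
    "http://127.0.0.1:5000/chat",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-type": "application/json"},
)

try:
    with request.urlopen(req, timeout=60) as resp:
        print(resp.read().decode("utf-8"))
except error.URLError as exc:
    # Opey isn't running (or isn't reachable) at 127.0.0.1:5000.
    print(f"Could not reach Opey: {exc}")
```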
To run locally without Docker, install the dependencies:

    pip install -r requirements.txt

- Copy `.env-example` to `.env`
- Copy over your OpenAI API key
- For running locally, set `REDIS_HOST=localhost`
- You will also need Redis set up and running locally; see the Redis documentation for installation instructions
We need to register the OBP API documentation in a vector index; run:

    python create_vector_index.py
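Conceptually, this script embeds each endpoint and glossary document and stores the vectors so the bot can retrieve relevant docs for a question. A rough sketch of the retrieval idea, using toy vectors in place of real OpenAI embeddings and a brute-force NumPy search in place of FAISS:

```python
import numpy as np

# Toy "embeddings": one vector per document (stand-ins for real embeddings).
docs = ["create customer", "get accounts", "list banks"]
vectors = np.array([[1.0, 0.0], [0.0, 1.0], [0.7, 0.7]])

def nearest(query_vec, vectors, docs):
    # Cosine similarity of the query against every stored vector;
    # FAISS does this (approximately) at scale.
    sims = vectors @ query_vec / (
        np.linalg.norm(vectors, axis=1) * np.linalg.norm(query_vec)
    )
    return docs[int(np.argmax(sims))]

print(nearest(np.array([0.9, 0.1]), vectors, docs))  # → create customer
```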
For development:

    python app.py

For production we use gunicorn:

    gunicorn --bind 0.0.0.0:5000 app:app
Chatting with the bot works the same as for Docker (see above).