Initial commit
mykolaskrynnyk committed Oct 31, 2024
0 parents commit 52b207d
Showing 46 changed files with 4,040 additions and 0 deletions.
43 changes: 43 additions & 0 deletions .dockerignore
@@ -0,0 +1,43 @@
# Ignore Python cache files
__pycache__/
*.pyc
*.pyo
*.pyd

# Ignore virtual environment directories
venv/
env/
.venv/
.env/

# Jupyter Notebook checkpoints
.ipynb_checkpoints/

# Local configuration files
*.env
*.local

# Ignore test and coverage files
tests/
*.cover
.coverage
nosetests.xml
coverage.xml
*.log

# Ignore IDE/editor specific files
.vscode/
.idea/

# Ignore Docker files
Dockerfile
docker-compose.yml

# Ignore documentation files
docs/
*.md

# Ignore other unnecessary files
*.DS_Store
*.tmp
*.temp
2 changes: 2 additions & 0 deletions .gitattributes
@@ -0,0 +1,2 @@
# Auto detect text files and perform LF normalization
* text=auto
142 changes: 142 additions & 0 deletions .gitignore
@@ -0,0 +1,142 @@
# Byte-compiled / optimized / DLL files
__pycache__/
*.py[cod]
*$py.class

# C extensions
*.so

# Distribution / packaging
.Python
build/
develop-eggs/
dist/
downloads/
eggs/
.eggs/
lib/
lib64/
parts/
sdist/
var/
wheels/
share/python-wheels/
*.egg-info/
.installed.cfg
*.egg
MANIFEST

# PyInstaller
# Usually these files are written by a python script from a template
# before PyInstaller builds the exe, so as to inject date/other infos into it.
*.manifest
*.spec

# Installer logs
pip-log.txt
pip-delete-this-directory.txt

# Unit test / coverage reports
htmlcov/
.tox/
.nox/
.coverage
.coverage.*
.cache
nosetests.xml
coverage.xml
*.cover
*.py,cover
.hypothesis/
.pytest_cache/
cover/

# Translations
*.mo
*.pot

# Django stuff:
*.log
local_settings.py
db.sqlite3
db.sqlite3-journal

# Flask stuff:
instance/
.webassets-cache

# Scrapy stuff:
.scrapy

# Sphinx documentation
docs/_build/

# PyBuilder
.pybuilder/
target/

# Jupyter Notebook
.ipynb_checkpoints

# IPython
profile_default/
ipython_config.py

# pyenv
# For a library or package, you might want to ignore these files since the code is
# intended to run in multiple environments; otherwise, check them in:
# .python-version

# pipenv
# According to pypa/pipenv#598, it is recommended to include Pipfile.lock in version control.
# However, in case of collaboration, if having platform-specific dependencies or dependencies
# having no cross-platform support, pipenv may install dependencies that don't work, or not
# install all needed dependencies.
#Pipfile.lock

# PEP 582; used by e.g. github.com/David-OConnor/pyflow
__pypackages__/

# Celery stuff
celerybeat-schedule
celerybeat.pid

# SageMath parsed files
*.sage.py

# Environments
.env
.venv
env/
venv/
ENV/
env.bak/
venv.bak/

# Spyder project settings
.spyderproject
.spyproject

# Rope project settings
.ropeproject

# mkdocs documentation
/site

# mypy
.mypy_cache/
.dmypy.json
dmypy.json

# Pyre type checker
.pyre/

# pytype static type analyzer
.pytype/

# Cython debug symbols
cython_debug/

# Manually added for this project
.idea/
**/.DS_Store
10 changes: 10 additions & 0 deletions Dockerfile
@@ -0,0 +1,10 @@
FROM python:3.11.7-slim
RUN apt-get update -y \
&& apt-get install libpq-dev -y \
&& rm -rf /var/lib/apt/lists/*
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir --upgrade -r requirements.txt
COPY . .
EXPOSE 8000
CMD ["uvicorn", "main:app", "--host", "0.0.0.0", "--port", "8000"]
28 changes: 28 additions & 0 deletions LICENSE
@@ -0,0 +1,28 @@
BSD 3-Clause License

Copyright (c) 2024, United Nations Development Programme

Redistribution and use in source and binary forms, with or without
modification, are permitted provided that the following conditions are met:

1. Redistributions of source code must retain the above copyright notice, this
list of conditions and the following disclaimer.

2. Redistributions in binary form must reproduce the above copyright notice,
this list of conditions and the following disclaimer in the documentation
and/or other materials provided with the distribution.

3. Neither the name of the copyright holder nor the names of its
contributors may be used to endorse or promote products derived from
this software without specific prior written permission.

THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE
FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
8 changes: 8 additions & 0 deletions Makefile
@@ -0,0 +1,8 @@
install:
pip install --upgrade pip && pip install -r requirements_dev.txt
format:
isort . --profile black --multi-line 3 && black .
lint:
pylint main.py src/
test:
python -m pytest tests/
148 changes: 148 additions & 0 deletions README.md
@@ -0,0 +1,148 @@
# Future Trends and Signals System (FTSS) API

[![Python 3.11+](https://img.shields.io/badge/python-3.11+-blue.svg)](https://www.python.org/downloads/release/python-3110/)
[![License](https://img.shields.io/github/license/undp-data/ftss-api)](https://github.com/undp-data/ftss-api/blob/main/LICENSE)
[![Black](https://img.shields.io/badge/code%20style-black-000000.svg)](https://github.com/psf/black)
[![Imports: isort](https://img.shields.io/badge/%20imports-isort-%231674b1?style=flat&labelColor=ef8336)](https://pycqa.github.io/isort/)
[![Conventional Commits](https://img.shields.io/badge/Conventional%20Commits-1.0.0-%23FE5196?logo=conventionalcommits&logoColor=white)](https://conventionalcommits.org)

This repository hosts the API that powers the [UNDP Future Trends and Signals System](https://signals.data.undp.org) (FTSS).
The API is written using [FastAPI](https://fastapi.tiangolo.com) in Python and deployed on Azure App Services.
It serves as an intermediary between the front-end application and the back-end database. The codebase is an open-source
version of the original project, transferred from Azure DevOps.

## Table of Contents

- [Introduction](#introduction)
- [Getting Started](#getting-started)
- [Build and Test](#build-and-test)
- [Contribute](#contribute)
- [License](#license)
- [Contact](#contact)

## Introduction

The FTSS is an internal system built for the staff of the United Nations Development Programme, designed to capture
signals of change and identify emerging trends within and outside the organisation. This repository hosts the back-end
API that powers the platform and is accompanied by the [front-end repository](https://github.com/undp-data/fe-signals-and-trends).

The API is written and tested in Python `3.11` using the [FastAPI](https://fastapi.tiangolo.com) framework. Database and
storage routines are implemented in an asynchronous manner, making the application fast and responsive. The API is
deployed on Azure App Services to development and production environments from `dev` and `main` branches
respectively. The API interacts with a PostgreSQL database deployed as an Azure Database for PostgreSQL instance. The
instance comprises `staging` and `production` databases. An Azure Blob Storage container stores images used as
illustrations for signals and trends. The simplified architecture of the whole application is shown in the image below.

Commits to the `staging` branch in the front-end repository and the `dev` branch in this repository trigger CI/CD
pipelines for the staging environment. Although there is a single database instance, staging data is isolated in the
`staging` database/schema, separate from the `production` database/schema. The same logic applies to the blob storage:
images uploaded in the staging environment are managed separately from those in the production environment.

![Preview](images/architecture.drawio.svg)
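The asynchronous pattern mentioned above can be illustrated with a stdlib-only sketch. The coroutine names and data below are placeholders, not functions from this repository; the point is that independent database and storage calls can be awaited concurrently rather than serially:

```python
# Illustrative sketch of the async pattern described above; the coroutine
# names and URLs are placeholders, not code from this repository.
import asyncio


async def fetch_signal(signal_id: int) -> dict:
    await asyncio.sleep(0)  # stands in for an async database query
    return {"id": signal_id, "name": f"signal-{signal_id}"}


async def fetch_image_url(signal_id: int) -> str:
    await asyncio.sleep(0)  # stands in for an async blob-storage call
    return f"https://example.blob.core.windows.net/images/{signal_id}.png"


async def get_signal_with_image(signal_id: int) -> dict:
    # Run the database and storage calls concurrently instead of one after another.
    signal, url = await asyncio.gather(
        fetch_signal(signal_id), fetch_image_url(signal_id)
    )
    return {**signal, "image_url": url}


if __name__ == "__main__":
    print(asyncio.run(get_signal_with_image(1)))
```

In an `async def` FastAPI endpoint, the same `await`/`gather` pattern keeps the event loop free to serve other requests while I/O is in flight.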

Authentication in the API happens via tokens (JWT) issued by Microsoft Entra upon user log-in in the front-end
application. Some endpoints to retrieve approved signals/trends are accessible with a static API key
for integration with other applications.
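A minimal sketch of this dual scheme follows. The `access_token` header name, the key value, and the checks are illustrative assumptions, not the repository's actual implementation; a real implementation would verify the JWT signature against Microsoft Entra's signing keys:

```python
# Illustrative sketch of token-or-API-key authorisation; not the
# repository's actual code. The header names are assumptions.
import hmac

STATIC_API_KEY = "strong-password"  # placeholder for the API_KEY setting


def is_authorized(headers: dict[str, str]) -> bool:
    """Accept either a Bearer token or the static API key header."""
    auth = headers.get("Authorization", "")
    if auth.startswith("Bearer ") and len(auth) > len("Bearer "):
        # A real implementation would decode and verify the JWT here.
        return True
    api_key = headers.get("access_token", "")
    # Constant-time comparison avoids leaking the key via timing.
    return hmac.compare_digest(api_key, STATIC_API_KEY)
```

In FastAPI, such a check would typically live in a dependency so that individual endpoints can opt into token-only or token-or-key access.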

## Getting Started

To run the application locally, you can use either your local environment or a Docker container. Either way,
clone the repository and navigate to the project directory first:

```shell
# Clone the repository
git clone https://github.com/undp-data/ftss-api

# Navigate to the project folder
cd ftss-api
```

You must also ensure that the following environment variables are set:

```text
# Authentication
TENANT_ID="<microsoft-entra-tenant-id>"
CLIENT_ID="<app-id>"
API_KEY="<strong-password>" # for accessing "public" endpoints
# Database and Storage
DB_CONNECTION="postgresql://<user>:<password>@<host>:5432/<staging|production>"
SAS_URL="https://<account-name>.blob.core.windows.net/<container-name>?<sas-token>"
# Azure OpenAI, only required for `/signals/generation`
AZURE_OPENAI_ENDPOINT="https://<subdomain>.openai.azure.com/"
AZURE_OPENAI_API_KEY="<api-key>"
# Testing, only required to run tests, must be a valid token of a regular user
API_JWT="<json-token>"
```
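Missing variables are easiest to catch before start-up. A stdlib-only sketch (the variable names follow the list above; the helper itself is illustrative, not part of the codebase):

```python
# Illustrative start-up check for the environment variables listed above;
# the helper is a sketch, not part of this repository.
import os

REQUIRED = ("TENANT_ID", "CLIENT_ID", "API_KEY", "DB_CONNECTION", "SAS_URL")


def missing_variables(required=REQUIRED):
    """Return the names of required variables that are unset or empty."""
    return [name for name in required if not os.environ.get(name)]


if __name__ == "__main__":
    missing = missing_variables()
    if missing:
        print("Missing environment variables:", ", ".join(missing))
```

Failing fast here gives a clearer error than a connection failure deep inside a request handler.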

### Local Environment

For this scenario, you will need a connection string to the staging database.

```bash
# Create and activate a virtual environment.
python3 -m venv venv
source venv/bin/activate

# Install core dependencies.
pip install -r requirements.txt

# Launch the application.
uvicorn main:app --reload
```

Once launched, the application will be running at http://127.0.0.1:8000.

### Docker Environment

For this scenario, you do not need a connection string, as a fresh PostgreSQL instance will be
set up for you in a container. Ensure that the Docker engine is running on your machine, then execute:

```shell
# Start the containers
docker compose up --build -d
```

Once launched, the application will be running at http://127.0.0.1:8000.

## Build and Test

The codebase provides some basic tests written in `pytest`. To run them, ensure you have specified a valid token in your
`API_JWT` environment variable. Then run:

```shell
# run all tests
python -m pytest tests/

# or alternatively
make test
```

Note that some tests for search endpoints might fail because they run against dynamically changing databases.

## Contribute

All contributions must follow [Conventional Commits](https://www.conventionalcommits.org/en/v1.0.0/).
The codebase is formatted with `black` and `isort`. Use the provided [Makefile](Makefile) for these
routine operations. Make sure to run the linter against your code.

1. Clone or fork the repository
2. Create a new branch (`git checkout -b feature-branch`)
3. Make your changes
4. Ensure your code is properly formatted (`make format`)
5. Run the linter and check for any issues (`make lint`)
6. Execute the tests (`make test`)
7. Commit your changes (`git commit -m 'Add some feature'`)
8. Push to the branch (`git push origin feature-branch`)
9. Open a pull request to the `dev` branch
10. Once tested in the staging environment, open a pull request to the `main` branch

## Contact

This project was originally developed and is maintained by [Data Futures Exchange (DFx)](https://data.undp.org) at UNDP.
If you encounter any issues or would like to make suggestions, feel free to
[open an issue](https://github.com/undp-data/ftss-api/issues/new/choose).
For enquiries about DFx, visit [Contact Us](https://data.undp.org/contact-us).