Feature #14

Closed · wants to merge 2 commits
162 changes: 162 additions & 0 deletions backend/.gitignore
@@ -0,0 +1,162 @@
# Byte-compiled / optimized / DLL files
__pycache__/
*.py[cod]
*$py.class

# C extensions
*.so

# Distribution / packaging
.Python
build/
develop-eggs/
dist/
downloads/
eggs/
.eggs/
lib/
lib64/
parts/
sdist/
var/
wheels/
share/python-wheels/
*.egg-info/
.installed.cfg
*.egg
MANIFEST

# PyInstaller
# Usually these files are written by a python script from a template
# before PyInstaller builds the exe, so as to inject date/other infos into it.
*.manifest
*.spec

# Installer logs
pip-log.txt
pip-delete-this-directory.txt

# Unit test / coverage reports
htmlcov/
.tox/
.nox/
.coverage
.coverage.*
.cache
nosetests.xml
coverage.xml
*.cover
*.py,cover
.hypothesis/
.pytest_cache/
cover/

# Translations
*.mo
*.pot

# Django stuff:
*.log
local_settings.py
db.sqlite3
db.sqlite3-journal

# Flask stuff:
instance/
.webassets-cache

# Scrapy stuff:
.scrapy

# Sphinx documentation
docs/_build/

# PyBuilder
.pybuilder/
target/

# Jupyter Notebook
.ipynb_checkpoints

# IPython
profile_default/
ipython_config.py

# pyenv
# For a library or package, you might want to ignore these files since the code is
# intended to run in multiple environments; otherwise, check them in:
# .python-version

# pipenv
# According to pypa/pipenv#598, it is recommended to include Pipfile.lock in version control.
# However, in case of collaboration, if having platform-specific dependencies or dependencies
# having no cross-platform support, pipenv may install dependencies that don't work, or not
# install all needed dependencies.
#Pipfile.lock

# poetry
# Similar to Pipfile.lock, it is generally recommended to include poetry.lock in version control.
# This is especially recommended for binary packages to ensure reproducibility, and is more
# commonly ignored for libraries.
# https://python-poetry.org/docs/basic-usage/#commit-your-poetrylock-file-to-version-control
#poetry.lock

# pdm
# Similar to Pipfile.lock, it is generally recommended to include pdm.lock in version control.
#pdm.lock
# pdm stores project-wide configurations in .pdm.toml, but it is recommended to not include it
# in version control.
# https://pdm-project.org/#use-with-ide
.pdm.toml
.pdm-python
.pdm-build/

# PEP 582; used by e.g. github.com/David-OConnor/pyflow and github.com/pdm-project/pdm
__pypackages__/

# Celery stuff
celerybeat-schedule
celerybeat.pid

# SageMath parsed files
*.sage.py

# Environments
.env
.venv
env/
venv/
ENV/
env.bak/
venv.bak/

# Spyder project settings
.spyderproject
.spyproject

# Rope project settings
.ropeproject

# mkdocs documentation
/site

# mypy
.mypy_cache/
.dmypy.json
dmypy.json

# Pyre type checker
.pyre/

# pytype static type analyzer
.pytype/

# Cython debug symbols
cython_debug/

# PyCharm
# JetBrains specific template is maintained in a separate JetBrains.gitignore that can
# be found at https://github.com/github/gitignore/blob/main/Global/JetBrains.gitignore
# and can be added to the global gitignore or merged into this file. For a more nuclear
# option (not recommended) you can uncomment the following to ignore the entire idea folder.
#.idea/
23 changes: 23 additions & 0 deletions backend/.pre-commit-config.yaml
@@ -0,0 +1,23 @@
repos:

# Versioning: Commit messages & changelog
- repo: https://github.com/commitizen-tools/commitizen
rev: v3.27.0
hooks:
- id: commitizen
stages: [commit-msg]

# Lint / autoformat: Python code
- repo: https://github.com/astral-sh/ruff-pre-commit
rev: "v0.4.8"
hooks:
- id: ruff
- id: ruff-format

# Trailing whitespace
- repo: https://github.com/pre-commit/pre-commit-hooks
rev: v2.3.0
hooks:
- id: check-yaml
- id: end-of-file-fixer
- id: trailing-whitespace
82 changes: 44 additions & 38 deletions backend/README.md
@@ -3,7 +3,7 @@
## Backend is created with [Django](https://www.djangoproject.com/)
This project was bootstrapped with [Geodjango Template](https://github.com/itskshitiz321/geodjangotemplate.git)
#### Quick Start
**Note:** Installation will vary depending on your OS and environment. This project depends tightly on [Tensorflow](https://www.tensorflow.org/install/pip) with GPU support, so build your development environment accordingly
### Install Python3, pip and virtualenv first
##### Skip this step if you already have them
```
sudo apt-get install python3
```
@@ -19,13 +19,13 @@
```
sudo apt-get install git-lfs
```

- Clone Ramp Basemodel
```
git clone https://github.com/radiantearth/model_ramp_baseline.git
```

- Clone Ramp - Code
Note: This clone location will be your RAMP_HOME
```
git clone https://github.com/kshitijrajsharma/ramp-code-fAIr.git ramp-code
```
@@ -36,19 +36,19 @@
```
cp -r model_ramp_baseline/data/input/checkpoint.tf ramp-code/ramp/checkpoint.tf
```


- Remove the basemodel repo; we don't need it anymore
```
rm -rf model_ramp_baseline
```
- Install numpy
Numpy needs to be installed before GDAL
```
pip install numpy==1.23.5
```

- Install gdal and rasterio
Based on your environment, you can either use conda or set up manually on your OS.
For example, on Ubuntu:
```
sudo add-apt-repository ppa:ubuntugis/ppa && sudo apt-get update
sudo apt-get install gdal-bin
```

@@ -58,12 +58,12 @@

```
export C_INCLUDE_PATH=/usr/include/gdal
pip install --global-option=build_ext --global-option="-I/usr/include/gdal" GDAL==`gdal-config --version`
```

- Install Ramp dependencies
```
cd ramp-code && cd colab && make install
```

- For Conda users: you may need to install rtree, gdal, rasterio & imagecodecs separately

```
conda install -c conda-forge rtree
```

@@ -82,14 +82,14 @@

```
conda install -c conda-forge imagecodecs
pip install --upgrade setuptools
```

- Install fAIr Utilities
```
pip install hot-fair-utilities==1.0.41
```

**Remember: in order to run fAIr, you need to configure your PC with TensorFlow GPU support**

You can check your GPU with:

```
import tensorflow as tf
print("Num GPUs Available: ", len(tf.config.experimental.list_physical_devices('GPU')))
```


- Install psycopg2
Again, based on your OS/env you can do a manual installation; for example, on Ubuntu:
```
sudo apt-get install python3-psycopg2
```

- Install redis server on your PC

```
sudo apt install redis
```
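
Once installed, you can confirm the server is running (an illustrative check, assuming the default port 6379):

```shell
# Prints PONG when the redis server is reachable on the default port
redis-cli ping
```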

- Install pdm for dependency management

```
pip install pdm
```

- Finally install project dependencies

```
pdm install
```

### Make sure you have postgresql installed with postgis extension enabled
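
A minimal setup sketch for Ubuntu, assuming the database name `ai` from the `DATABASE_URL` used in this README; adjust package names, database name, and auth to your setup:

```shell
# Install PostgreSQL and the PostGIS extension (Ubuntu package names)
sudo apt-get install postgresql postgis

# Create the database and enable PostGIS in it
# (the database name "ai" matches the DATABASE_URL example in this README)
sudo -u postgres createdb ai
sudo -u postgres psql -d ai -c "CREATE EXTENSION postgis;"
```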


#### Configure .env:
Create .env in the root backend project and add the credentials as provided in .env_sample. Export your secret key and database URL to your env

Export your database URL
```
export DATABASE_URL=postgis://postgres:postgres@localhost:5432/ai
```
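
The URL follows the usual `scheme://user:password@host:port/dbname` form; a quick, illustrative sanity check of the value (not part of the project):

```shell
# Check that DATABASE_URL matches postgis://user:password@host:port/dbname
export DATABASE_URL=postgis://postgres:postgres@localhost:5432/ai
echo "$DATABASE_URL" | grep -Eq '^postgis://[^:]+:[^@]+@[^:]+:[0-9]+/.+$' && echo "DATABASE_URL looks valid"
```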
You will need more env variables (such as Ramp home and Training home) that can be found in ```.sample_env```

#### Now change your username, password and db name in settings.py according to your database

```
python manage.py makemigrations login core
python manage.py migrate
python manage.py runserver
```
### The server will now be available on port 8000; you can check out localhost:8000/admin for the admin panel
To log in to the admin panel, create a superuser and log in with your credentials after restarting the server

```
python manage.py createsuperuser
```

## Authentication
fAIr uses OAuth 2.0 authentication via [osm-login-python](https://github.com/kshitijrajsharma/osm-login-python)
1. Get your login URL
Hit ```/api/v1/auth/login/```
- The URL will give you a login URL which you can use to provide your OSM credentials and authorize fAIr
- After successful login you will get an access-token that you can use across all OSM-login-required endpoints in fAIr
2. Check authentication by getting back your data
Hit ```/api/v1/auth/me/```
- The URL requires the access-token as a header, and in return you will see your OSM username, id and image URL
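
For example, with curl (a sketch; the exact header name is an assumption here, so verify it against the API docs for your deployment):

```shell
# Replace <your-token> with the access-token returned after login
# The header name "access-token" is an assumption for illustration
curl -H "access-token: <your-token>" http://127.0.0.1:8000/api/v1/auth/me/
```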


## Start celery workers

- Start the celery workers

```
celery -A aiproject worker --loglevel=debug -n my_worker
```

- Monitor using flower
If you are using redis as the result backend (the API supports both options, django / redis), you can start flower to monitor your tasks:
```
celery -A aiproject --broker=redis://127.0.0.1:6379/0 flower
```

## Run Tests

```
python manage.py test
```


# Build fAIr with Docker for Development
- Install all the required drivers for your graphics card so it can be accessed from containers, and check your graphics and drivers with ```nvidia-smi```. So far only NVIDIA is supported
- Follow docker_sample_env to create a ```.env``` file in your dir
- Build the Image
