
Automate dashboard scaffolding Part 2 #177

Draft · wants to merge 21 commits into base: main

Commits (21):
bd64a99
Add pytest and coverage as additional dependencies for the automated …
iantei Oct 29, 2024
de3d126
Enable running of plots.py from pytest, aside from Notebook
iantei Oct 29, 2024
0f3099e
Add Git Action workflow
iantei Oct 29, 2024
5d8548a
Add docker-compose file for automated unit testing
iantei Oct 29, 2024
5995589
Initial unit test cases for plots.py
iantei Oct 29, 2024
63867f4
Add extra line at the end
iantei Oct 29, 2024
bba61b6
Simplify import of parent directory with the use of PYTHONPATH=../.. …
iantei Nov 5, 2024
e590384
Initial test cases for scaffolding.py
iantei Nov 12, 2024
b9e30cc
Fix run for tests with docker-compose in test_with_docker.yml
iantei Nov 20, 2024
0b1fe21
Remove mock.patch, and use direct function calls instead. mock.patch …
iantei Nov 20, 2024
3ddd56b
Update unit test cases for PURPOSE, REPLACED_MODE for mapping_labels …
iantei Nov 21, 2024
31f6082
Update unit test for mapping_color_surveys
iantei Nov 27, 2024
d86fbe2
Update unit tests for get_quality_text()
iantei Nov 27, 2024
1258944
Update unit tests for get_quality_text_sensed()
iantei Nov 27, 2024
6cca577
Update unit tests for get_quality_text_numerator()
iantei Nov 27, 2024
8952810
Update unit tests for get_file_suffix()
iantei Nov 27, 2024
aaf7a9e
Update unit tests for unit_conversions. Update distance column in the…
iantei Nov 28, 2024
f81c058
Update unit tests for filter_labeled_trips()
iantei Nov 28, 2024
ef760c6
Fix test_with_docker.yml file run command
iantei Nov 28, 2024
ef1cb25
Update unit tests for expand_userinputs()
iantei Nov 28, 2024
e1349a0
Extract dynamic_labels to create a pytest.fixture for reusability. Up…
iantei Nov 28, 2024
36 changes: 36 additions & 0 deletions .github/workflows/test_with_docker.yml
@@ -0,0 +1,36 @@
# This is a basic workflow to help you get started with Actions

name: test-with-docker

# Controls when the action will run. Triggers the workflow on push or pull request
# events, but only for the main branch
on:
  push:
    branches: [ main ]
  pull_request:
    branches: [ main ]
  schedule:
    # * is a special character in YAML so you have to quote this string
    - cron: '5 4 * * 0'

# A workflow run is made up of one or more jobs that can run sequentially or in parallel
jobs:
  # This workflow contains a single job called "build"
  build:
    # The type of runner that the job will run on
    runs-on: ubuntu-latest

    # Steps represent a sequence of tasks that will be executed as part of the job
    steps:
      # Checks-out your repository under $GITHUB_WORKSPACE, so your job can access it
      - name: Checkout
        uses: actions/checkout@v2

      - name: Make sure that the workflow works
        run: echo Smoke test

      - name: Run the tests using docker-compose
        working-directory: .github/workflows
        run: |
          docker compose -f ../../docker-compose.tests.yml build
          docker compose -f ../../docker-compose.tests.yml up --exit-code-from notebook-server
51 changes: 51 additions & 0 deletions docker-compose.tests.yml
@@ -0,0 +1,51 @@
version: "3"
services:
  dashboard:
    image: em-pub-dash-dev/frontend
    build:
      context: frontend
      dockerfile: docker/Dockerfile.dev
    depends_on:
      - db
    ports:
      # DASH in numbers
      - "3274:6060"
    volumes:
      - ./frontend:/public
      - ./plots:/public/plots
    networks:
      - emission
  notebook-server:
    image: em-pub-dash-dev/viz-scripts
    build:
      context: viz_scripts
      dockerfile: docker/Dockerfile.test
      args:
        SERVER_IMAGE_TAG: ${SERVER_IMAGE_TAG}
    depends_on:
      - db
    environment:
      - DB_HOST=db
      - WEB_SERVER_HOST=0.0.0.0
      - CRON_MODE=
      - STUDY_CONFIG=stage-program
    ports:
      # ipynb in numbers
      - "47962:47962"
    networks:
      - emission
    volumes:
      - ./viz_scripts:/usr/src/app/saved-notebooks
      - ./plots:/plots
  db:
    image: mongo:4.4.0
    volumes:
      - mongo-data:/data/db
    networks:
      - emission

networks:
  emission:

volumes:
  mongo-data:
19 changes: 19 additions & 0 deletions viz_scripts/docker/Dockerfile.test
@@ -0,0 +1,19 @@
# python 3
ARG SERVER_IMAGE_TAG
FROM shankari/e-mission-server:master_${SERVER_IMAGE_TAG}

VOLUME /plots

ADD docker/environment36.dashboard.additions.yml /

WORKDIR /usr/src/app

RUN /bin/bash -c "source setup/activate.sh && conda env update --name emission --file setup/environment36.notebook.additions.yml"
RUN /bin/bash -c "source setup/activate.sh && conda env update --name emission --file /environment36.dashboard.additions.yml"

ADD docker/start_tests.sh /usr/src/app/.docker/start_tests.sh
RUN chmod u+x /usr/src/app/.docker/start_tests.sh

EXPOSE 8888

CMD ["/bin/bash", "/usr/src/app/.docker/start_tests.sh"]
2 changes: 2 additions & 0 deletions viz_scripts/docker/environment36.dashboard.additions.yml
@@ -4,6 +4,8 @@ channels:
  - defaults
dependencies:
  - seaborn=0.11.1
  - pytest
  - coverage
  - pip:
    - nbparameterise==0.6
    - devcron==0.4
13 changes: 13 additions & 0 deletions viz_scripts/docker/start_tests.sh
@@ -0,0 +1,13 @@
#!/bin/bash
set -e # Exit on error

# change python environment
pwd
source setup/activate.sh || exit 1
conda env list
cd saved-notebooks/tests || exit 1

echo "Starting unit tests..."
PYTHONPATH=../.. coverage run -m pytest . -v

coverage report
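Note how `PYTHONPATH=../..` interacts with the tests: it puts the repo root on the import path, and the tests then load the module via `importlib.import_module('saved-notebooks.plots')`, because `saved-notebooks` contains a hyphen and cannot appear in a plain `import` statement. A self-contained illustration of that mechanism (the `demo-pkg` directory and `ANSWER` constant are made up for the demo):

```python
import importlib
import pathlib
import sys
import tempfile

# Build a throwaway package whose name is not a valid Python identifier
root = pathlib.Path(tempfile.mkdtemp())
pkg = root / "demo-pkg"          # hyphenated, like 'saved-notebooks'
pkg.mkdir()
(pkg / "__init__.py").write_text("")
(pkg / "mod.py").write_text("ANSWER = 42\n")

# Equivalent of PYTHONPATH=../..: make the parent directory importable
sys.path.insert(0, str(root))

# `import demo-pkg.mod` would be a SyntaxError; import_module takes a string
mod = importlib.import_module("demo-pkg.mod")
print(mod.ANSWER)  # 42
```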
12 changes: 11 additions & 1 deletion viz_scripts/plots.py
@@ -9,7 +9,17 @@

 sns.set_style("whitegrid")
 sns.set()
-get_ipython().run_line_magic('matplotlib', 'inline')
+
+try:
+    # Import the function
+    from IPython import get_ipython
+    # Check if running in an IPython environment (like Jupyter Notebook)
+    if get_ipython() is not None:
+        get_ipython().run_line_magic('matplotlib', 'inline')
+except ImportError:
+    # Handle the case where IPython is not installed
+    # We are running in regular Python (likely pytest), not Jupyter/IPython
+    pass

 # Module for pretty-printing outputs (e.g. head) to help users
 # understand what is going on
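The guard in this hunk generalizes to a small helper; a sketch under the assumption that the same pattern is wanted elsewhere (the helper name `enable_inline_matplotlib` is hypothetical, not part of this PR):

```python
def enable_inline_matplotlib():
    """Enable the '%matplotlib inline' magic only under IPython/Jupyter.

    Returns True if the magic was applied; False under plain Python
    (e.g. pytest), where get_ipython() is None or IPython is absent.
    """
    try:
        from IPython import get_ipython  # raises ImportError if not installed
    except ImportError:
        return False
    ipy = get_ipython()
    if ipy is None:
        # Regular interpreter: no magics available
        return False
    ipy.run_line_magic('matplotlib', 'inline')
    return True

enabled = enable_inline_matplotlib()  # False when run outside a notebook
```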
58 changes: 58 additions & 0 deletions viz_scripts/tests/test_plots.py
@@ -0,0 +1,58 @@
import pytest
import pandas as pd
import numpy as np
# Using import_module, as we have saved-notebooks as the directory
import importlib
plots = importlib.import_module('saved-notebooks.plots')

# Test Data Fixtures
@pytest.fixture
def sample_labels():
    return ['Car', 'Bus', 'Train', 'Walk']

@pytest.fixture
def sample_values():
    return [100, 50, 3, 1]

@pytest.fixture
def sample_labels_no_small():
    return ['Car', 'Bus']

@pytest.fixture
def sample_values_no_small():
    return [100, 100]

class TestCalculatePct:
    def test_calculate_pct_basic(self, sample_labels, sample_values):
        labels, values, pcts = plots.calculate_pct(sample_labels, sample_values)
        assert len(labels) == len(sample_labels)
        assert len(values) == len(sample_values)
        assert sum(pcts) == pytest.approx(100.0, abs=0.1)

    def test_calculate_pct_empty(self):
        labels, values, pcts = plots.calculate_pct([], [])
        assert len(labels) == 0
        assert len(values) == 0
        assert len(pcts) == 0

    def test_calculate_pct_single(self):
        labels, values, pcts = plots.calculate_pct(['Car'], [100])
        assert pcts == [100.0]

class TestMergeSmallEntries:
    def test_merge_small_entries_basic(self, sample_labels, sample_values):
        labels, values, pcts = plots.merge_small_entries(sample_labels, sample_values)
        assert all(pct > 2.0 for pct in pcts)

    def test_merge_small_entries_no_small(self, sample_labels_no_small, sample_values_no_small):
        result_labels, result_values, result_pcts = plots.merge_small_entries(sample_labels_no_small, sample_values_no_small)
        assert len(result_labels) == 2
        assert 'other' not in result_labels
        assert 'OTHER' not in result_labels

    def test_merge_small_entries_some_small(self, sample_labels, sample_values):
        result_labels, result_values, result_pcts = plots.merge_small_entries(sample_labels, sample_values)
        assert len(result_labels) == 3
        assert result_labels[0] in ['Car', 'Bus', 'other', 'OTHER']
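For context, the contract these tests encode can be sketched with a hypothetical reference implementation. This is an illustration of the expected behavior (percentages summing to 100, sub-threshold entries folded into an 'other' bucket), not the actual code in plots.py:

```python
def calculate_pct(labels, values):
    # Convert raw values into percentages of the total; empty input -> empty pcts
    total = sum(values)
    pcts = [v / total * 100 for v in values] if total else []
    return labels, values, pcts

def merge_small_entries(labels, values, threshold=2.0):
    # Fold entries below `threshold` percent into a single 'other' bucket
    _, _, pcts = calculate_pct(labels, values)
    kept = [(l, v) for l, v, p in zip(labels, values, pcts) if p >= threshold]
    small = [v for v, p in zip(values, pcts) if p < threshold]
    if small:
        kept.append(('other', sum(small)))
    merged_labels = [l for l, _ in kept]
    merged_values = [v for _, v in kept]
    return calculate_pct(merged_labels, merged_values)

# With the fixture data, 'Train' (3) and 'Walk' (1) fall under 2% of 154
labels, values, pcts = merge_small_entries(['Car', 'Bus', 'Train', 'Walk'],
                                           [100, 50, 3, 1])
print(labels)  # ['Car', 'Bus', 'other'] under this sketch
```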