This repo implements a machine learning system for detecting and tracking roosts in weather surveillance radar data. Roost detection is based on Detectron2 using PyTorch.
- checkpoints is for trained model checkpoints
- development is for developing detection models
- src is for system implementation
  - data
    - downloader downloads radar scans based on station and day; scan keys and directories for downloaded scans are based on UTC dates
    - renderer renders numpy arrays from downloaded scans, visualizes arrays, and deletes the scans after rendering; directories for rendered arrays and images are based on UTC dates
  - detection
  - evaluation contains customized evaluation adapted from pycocotools v2.0.2
  - tracking
  - utils contains various utils, scripts to postprocess roost tracks, and scripts to generate visualizations
- tools is for system deployment
  - demo.py downloads radar scans, renders arrays to be processed by models and some channels as images for visualization, detects and tracks roosts in them, and postprocesses the results.
  - launch_demo.py can call sbatch demo.sh multiple times to launch multiple jobs in parallel, each for a station-year and on separate cpus. It is configured for birds; see the launch sketch after this list.
  - launch_demo_bats.py is configured for bats.
  - demo.sh includes commands to run for each station-year, including running demo.py and pushing outputs from the computing cluster to our doppler server.
  - gen_deploy_station_days_scripts.py can create a launch*.py file and corresponding *.sh files for when we want each slurm job to include multiple calls to demo.py (e.g., to process several time periods at a station within one slurm job).
  - publish_images.sh sends images generated during system deployment to a server where we archive data. This has been incorporated into demo.sh.
  - (outdated) demo.ipynb is for interactively running the system and is not actively maintained.
  - (customization) launch_demo_tiff.py, demo_tiff.sh, and demo_tiff.py are customized for rendered arrays given as tiff files.
  - (deprecated) add_local_time_to_output_files.py takes in scans*.txt and tracks*.txt files produced by system deployment and appends local time to each line. The system should now handle this automatically.
  - (deprecated) post_hoc_counting takes in tracks* files and computes estimated numbers of animals in each bounding box. The system should now handle this automatically.
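For reference, the launch pattern in launch_demo.py boils down to submitting demo.sh once per station-year. The sketch below is illustrative only: the station list, years, and demo.sh argument order are assumptions, not the real script's interface.

```python
import subprocess

# Submit one slurm job per station-year by calling sbatch on demo.sh.
# STATIONS/YEARS and the argument order are hypothetical placeholders.
STATIONS = ["KDOX", "KAMX"]
YEARS = [2020, 2021]

for station in STATIONS:
    for year in YEARS:
        subprocess.run(["sbatch", "demo.sh", station, str(year)], check=True)
```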
- See Detectron2 requirements here. Find a compatible pytorch version here. To run detection with GPU, check the cuda version at, for example, `/usr/local/cuda`, or potentially by `nvcc -V`.

  ```bash
  conda create -n roostsys python=3.8
  conda activate roostsys

  # for development and inference with gpus, use the gpu version of torch; we assume cuda 11.3 here
  conda install pytorch==1.10.0 torchvision==0.11.0 cudatoolkit=11.3 -c pytorch -c conda-forge
  # for inference with cpus, use the cpu version of torch
  # conda install pytorch==1.10.0 torchvision==0.11.0 cpuonly -c pytorch

  git clone https://github.com/darkecology/roost-system.git
  cd roost-system
  pip install -e .
  ```
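After installation, a quick sanity check (not part of the repo) confirms that the environment imports cleanly and whether CUDA is visible:

```python
import torch
import detectron2

# Print versions and GPU visibility; CUDA should be available for GPU inference.
print("torch:", torch.__version__)
print("detectron2:", detectron2.__version__)
print("CUDA available:", torch.cuda.is_available())
```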
- (Optional) Jupyter notebook.

  ```bash
  pip install jupyter
  ```

  - Add the python environment to jupyter:

    ```bash
    conda install -c anaconda ipykernel
    python -m ipykernel install --user --name=roostsys
    ```

  - To check which environments are in jupyter as kernels and to delete one:

    ```bash
    jupyter kernelspec list
    jupyter kernelspec uninstall roostsys
    ```

  - Run jupyter notebook on a server: `jupyter notebook --no-browser --port=9991`
  - Monitor from local: `ssh -N -f -L localhost:9990:localhost:9991 username@server`
  - Enter `localhost:9990` in a local browser tab.
- development contains all training and evaluation scripts.
  - To prepare a training dataset (i.e. rendering arrays from radar scans and generating json files to define datasets with annotations), refer to Installation and Dataset Preparation in the README of wsrdata.
  - Before training, optionally run try_load_arrays.py to make sure there are no broken npz files; a minimal version of this check is sketched below.
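Conceptually the check is as simple as the following sketch; the arrays/ path is an assumption, and try_load_arrays.py remains the maintained version.

```python
import glob
import numpy as np

# Try to open every rendered npz file and report any that fail to load.
# "arrays/" is an assumed location; point this at the rendering output directory.
for path in sorted(glob.glob("arrays/**/*.npz", recursive=True)):
    try:
        with np.load(path) as data:
            _ = data.files  # forces the archive header to be read
    except Exception as e:
        print(f"broken: {path} ({e})")
```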
Latest model checkpoints are available here.
- v1: Beginning of Summer 2021 Zezhou model.
- v2: End of Summer 2021 Wenlong model with 48 AP. Better backbone, anchors, and other config.
- v3: End of Winter 2021 Gustavo model with 55 AP. Adapter layer and temporal features.
A Colab notebook for running small-scale inference is here. Large-scale deployment can be run on CPU servers as follows.
- Under checkpoints, download a trained detection checkpoint.
- Configure AWS by `aws configure` in order to download radar scans. Enter `AWS Access Key ID` and `AWS Secret Access Key` as prompted, `us-east-1` for `Default region name`, and nothing for `Default output format`. Review the updated AWS config:

  ```bash
  vim ~/.aws/credentials
  vim ~/.aws/config
  ```
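To confirm that scan access works before a large deployment, one can list a few keys from the public NEXRAD bucket. This snippet is a sanity check assuming boto3 is available, not part of the repo; the station and date are arbitrary examples.

```python
import boto3

# List a few Level-II scan keys for one station-day from the public NEXRAD archive.
s3 = boto3.client("s3")
resp = s3.list_objects_v2(Bucket="noaa-nexrad-level2",
                          Prefix="2021/06/01/KDOX/", MaxKeys=3)
for obj in resp.get("Contents", []):
    print(obj["Key"])
```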
- Modify demo.py for system customization. For example, DET_CFG can be changed to adopt a new detector, and CNT_CFG can be changed for different counting assumptions; an illustrative sketch follows.
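The sketch below only illustrates the kind of knobs involved; every key name is an assumption, so consult demo.py for the actual fields.

```python
# Hypothetical shapes of the two configs in demo.py, for illustration only.
DET_CFG = {
    "ckpt_path": "../checkpoints/v3.pth",  # which trained detector to load (assumed key)
    "score_thresh": 0.05,                  # minimum detection confidence kept (assumed key)
    "use_gpu": False,                      # cpu-only deployment (assumed key)
}
CNT_CFG = {
    "animal": "bird",                      # counting assumptions differ for birds vs. bats (assumed key)
}
```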
- Make sure the environment is activated. Then consider two deployment scenarios.
  - In the first, we process consecutive days at stations, i.e. we launch one job for each set of consecutive days at a station. Under tools, modify VARIABLES in launch_demo.py and run `python launch_demo.py` to submit jobs to slurm and process multiple batches of data.
  - In the second, we process scattered days at stations, i.e. we launch one job for all days from each station. Modify VARIABLES in tools/gen_deploy_station_days_scripts.py. Under tools, run `python gen_deploy_station_days_scripts.py` and then `bash scripts/launch_deploy_station_days_scripts.sh`. Each output txt file saves scans or tracks for one station-day; txt files for station-days from the same station need to be combined manually (see the merging sketch after this list).
  - GOTCHA 1: EXPERIMENT_NAME needs to be carefully chosen; it'll correspond to the dataset name later used in the web UI.
  - GOTCHA 2: If previous batches were processed under this EXPERIMENT_NAME (i.e. the dataset to be loaded into the website), we can move the previously processed data from the output directory to another location before saving newly processed data to this EXPERIMENT_NAME output directory. This way, when we copy the newly processed data to the server that hosts the web UI, the previous data won't need to be copied again.
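For the second scenario, merging per-station-day outputs can look like the sketch below; the filename pattern is an assumption and should be adapted to the actual output layout.

```python
import glob
from collections import defaultdict
from pathlib import Path

# Group per-station-day txt outputs by station and concatenate them into
# one file per station. Assumes names like scans_KDOX_20210601.txt.
by_station = defaultdict(list)
for path in sorted(glob.glob("scans_*_*.txt")):
    station = Path(path).stem.split("_")[1]
    by_station[station].append(path)

for station, paths in by_station.items():
    with open(f"scans_{station}.txt", "w") as out:
        for p in paths:
            out.write(Path(p).read_text())
```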
- geometric direction: large y is North (row 0 is South), large x is East
- image direction: large y is South (row 0 is North), large x is East
- Rendering
  - Render arrays for the model to process in the geometric direction
  - Render png images for visualization in the image direction
  - Generate the list of scans with successfully rendered arrays
- Detector in the geometric direction
  - During training and evaluation, doesn't use our defined Detector class
    - dataloader: XYXY
  - During deployment, uses our defined Detector class, which wraps a Predictor. The run function of this Detector flips the y axis of predicted boxes to get the image direction and outputs predicted boxes in xyr, where xy are center coordinates; see the sketch after this list.
- For rain removal post-processing using dualpol arrays, flip the y axis to operate in the image direction
- Generate the list of predicted tracks to accompany png images in the image direction
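The y flip between the two conventions amounts to the following sketch (not the repo's actual function; the exact meaning of r is an assumption):

```python
def geo_to_image_xyr(x, y, r, height):
    """Flip a box center from the geometric direction (row 0 = South) to the
    image direction (row 0 = North). height is the number of rows in the
    rendered array; r is assumed to be the box's half side length."""
    return x, height - y, r
```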
In the generated csv files that can be imported to a user interface for visualization, the following information could be used to further filter the tracks:
- track length
- detection scores (-1 means the bbox came from our tracking algorithm rather than the detector)
- bbox sizes
- the minutes from sunrise/sunset of the first bbox in a track
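For example, with pandas such filtering might look like the following; the column names are assumptions, so check the csv header for the real ones.

```python
import pandas as pd

# Keep boxes from sufficiently long tracks with confident detection scores.
# "track_len" and "det_score" are hypothetical column names.
df = pd.read_csv("tracks.csv")
keep = (df["track_len"] >= 3) & (df["det_score"] > 0.5)  # -1 marks boxes from tracking, not detection
df[keep].to_csv("tracks_filtered.csv", index=False)
```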
[1] Detecting and Tracking Communal Bird Roosts in Weather Radar Data. Zezhou Cheng, Saadia Gabriel, Pankaj Bhambhani, Daniel Sheldon, Subhransu Maji, Andrew Laughlin and David Winkler. AAAI, 2020 (oral presentation, AI for Social Impact Track).
[2] Using Spatio-Temporal Information in Weather Radar Data to Detect and Track Communal Bird Roosts. Gustavo Perez, Wenlong Zhao, Zezhou Cheng, Maria Carolina T. D. Belotti, Yuting Deng, Victoria F. Simons, Elske Tielens, Jeffrey F. Kelly, Kyle G. Horton, Subhransu Maji, Daniel Sheldon. Preprint.