CVPRW2023: Enhancing Multi-Camera People Tracking with Anchor-Guided Clustering and Spatio-Temporal Consistency ID Re-Assignment
This is the official repository for the 7th NVIDIA AI City Challenge (2023) Track 1: Multi-Camera People Tracking. [Arxiv]
The official dataset can be downloaded from the AI City Challenge website (https://www.aicitychallenge.org/2023-data-and-evaluation/). You need to fill out the dataset request form to obtain the password to download them.
Per the DATASET LICENSE AGREEMENT from the dataset authors, we are not allowed to share the dataset.
2.c. ... you may not copy, sell, rent, sublicense, transfer or distribute the DATASET, or share with others.
The implementation of our work is built upon BoT-SORT, OpenMMLab, and torchreid. We also adapt Cal_PnP for camera self-calibration.
Four different environments are required for the reproduction process. Please install them according to the following repos:
- Installation for mmyolo*
- Installation for mmpose
- Installation for torchreid*
- Installation for BoT-SORT
* optional for fast reproduction
- Prepare MTMC Dataset and annotations
Download AIC23_Track1_MTMC_Tracking.zip from the AI City Challenge organizers, unzip it under the root directory of this repo, and run:
bash scripts/0_prepare_mtmc_data.sh
You should see the data folder organized as follows:
data
├── annotations
│ ├── fine_tune
│ │ ├── train_hospital_val_hospital_sr_20_0_img_15197.json
│ │ ├── train_market_val_market_sr_20_0_img_19965.json
│ │ ├── train_office_val_office_sr_20_0_img_20696.json
│ │ └── train_storage_val_storage_sr_20_0_img_15846.json
│ └── train_all_val_all_sr_20_10_img_77154.json
├── train
│ ├── S002
│ │ ├── c008
│ │ │ ├── frame
│ │ │ ├── label.txt
│ │ │ └── video.mp4
│ . .
│ . .
├── validation
└── test
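Before moving on, it can help to sanity-check the unzipped layout. The sketch below is illustrative only (the expected entries are an assumption based on the tree above; adjust them to your actual setup):

```python
from pathlib import Path

# Top-level entries expected under data/ after 0_prepare_mtmc_data.sh
# (assumed from the tree above; edit to match your setup).
EXPECTED = ["annotations/fine_tune", "train", "validation", "test"]

def missing_entries(root):
    """Return the expected sub-paths that do not exist under root."""
    root = Path(root)
    return [p for p in EXPECTED if not (root / p).exists()]

if __name__ == "__main__":
    missing = missing_entries("data")
    print("Missing: " + ", ".join(missing) if missing else "Layout OK")
```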
- Train yolov7 models on synthetic data
bash scripts/1_train_detector.sh
* Note that the configs we provide are the ones used in our submission. They may not be optimal for your GPU; please adjust the batch size accordingly.
- Prepare ReID Dataset
mkdir deep-person-reid/reid-data
Download our sampled dataset and unzip it under deep-person-reid/reid-data.
* Note that the file name DukeMTMC is used only for training convenience; the DukeMTMC dataset itself is not used in our training process.
- Train Reid model on synthetic data
bash scripts/2_train_reid.sh
- To Fast Reproduce
Directly use the txt files in the data/test_det folder and skip the following steps.
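For downstream use, the per-camera txt files typically follow the MOT detection layout `frame,id,x,y,w,h,score,...` (an assumption; inspect the files to confirm). A minimal loader sketch, with `load_detections` a hypothetical helper name:

```python
def load_detections(path):
    """Parse a MOT-style detection txt into {frame: [(x, y, w, h, score), ...]}.

    Assumes comma-separated lines of the form frame,id,x,y,w,h,score,...
    (the id column is -1 for raw detections and is ignored here).
    """
    dets = {}
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line:
                continue
            vals = line.split(",")
            frame = int(float(vals[0]))
            x, y, w, h, score = map(float, vals[2:7])
            dets.setdefault(frame, []).append((x, y, w, h, score))
    return dets
```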
- Prepare Models
- Download the pretrained YOLOX_x from ByteTrack [Google Drive]
- Download (or train from scratch) the YOLOv7 weights from [Google Drive]
- Get Real (S001) detection
bash scripts/3_inference_det_real.sh
- Get Synthetic detection
bash scripts/4_inference_det_syn.sh
- To Fast Reproduce
Download the embedding npy files and put all the npy files under data/test_emb, then you can skip steps 1 and 2.
- Prepare Models (optional)
- Download the ReID model for synthetic dataset
- Download the pretrained ReID models from torchreid, including osnet_ms_m_c, osnet_ibn_ms_m_c, osnet_ain_ms_m_c, osnet_x1_0_market, and osnet_x1_0_msmt17
- Put all the models in deep-person-reid/checkpoints
- Get Appearance Embedding (optional)
bash scripts/5_inference_emb.sh
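The exact fusion performed by scripts/5_inference_emb.sh is defined in the repo. As an illustration only, a common way to combine embeddings from several ReID backbones is to L2-normalize each model's output and average them (`fuse_embeddings` is a hypothetical name; this is not necessarily what the script does):

```python
import numpy as np

def fuse_embeddings(embs):
    """Average L2-normalized embeddings from several models, then
    re-normalize so the fused vector is unit length."""
    normed = [e / np.linalg.norm(e) for e in embs]
    fused = np.mean(normed, axis=0)
    return fused / np.linalg.norm(fused)
```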
The <root_path> in the following commands should be set to this repo's location.
- Navigate to the BoT-SORT folder
cd BoT-SORT
- Run tracking
conda activate botsort_env
python tools/run_tracking.py <root_path>
- Generate foot keypoint
conda activate mmpose
cd ../mmpose
python demo/top_down_video_demo_with_track_file.py <tracking_file.txt> \
configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/hrnet_w48_coco_256x192.py \
https://download.openmmlab.com/mmpose/top_down/hrnet/hrnet_w48_coco_256x192-b9e0b3ab_20200708.pth \
--video-path <video_file.mp4> \
--out-file <out_keypoint.json>
python tools/convert.py
- Conduct spatio-temporal consistency reassignment
python STCRA/run_stcra.py <input_tracking_file_folder> <output_tracking_file_folder>
- Generate final submission
cd ../BoT-SORT
python tools/aic_interpolation.py <root_path>
python tools/boundaryrect_removal.py <root_path>
python tools/generate_submission.py <root_path>
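tools/aic_interpolation.py fills temporal gaps in the tracks before submission; its exact logic lives in the repo. As an illustration only, linear interpolation of missing boxes within a single track can be sketched as follows (`interpolate_track` and the frame-to-box mapping are hypothetical, not the script's actual interface):

```python
def interpolate_track(track):
    """Fill frame gaps in one track by linear interpolation.

    track maps frame -> (x, y, w, h); returns a new dict with every
    intermediate frame between observed frames filled in.
    """
    frames = sorted(track)
    out = dict(track)
    for f0, f1 in zip(frames, frames[1:]):
        gap = f1 - f0
        if gap <= 1:
            continue  # consecutive frames: nothing to fill
        b0, b1 = track[f0], track[f1]
        for f in range(f0 + 1, f1):
            t = (f - f0) / gap
            out[f] = tuple(a + t * (b - a) for a, b in zip(b0, b1))
    return out
```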