[ALGORITHM]
```BibTeX
@inproceedings{lin2018bsn,
  title={BSN: Boundary sensitive network for temporal action proposal generation},
  author={Lin, Tianwei and Zhao, Xu and Su, Haisheng and Wang, Chongjing and Yang, Ming},
  booktitle={Proceedings of the European Conference on Computer Vision (ECCV)},
  pages={3--19},
  year={2018}
}
```
| config | feature | gpus | pretrain | AR@100 | AUC | gpu_mem(M) | iter time(s) | ckpt | log | json |
| :--- | :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: |
| bsn_400x100_1x16_20e_activitynet_feature | cuhk_mean_100 | 1 | None | 74.66 | 66.45 | 41(TEM)+25(PEM) | 0.074(TEM)+0.036(PEM) | ckpt_tem ckpt_pem | log_tem log_pem | json_tem json_pem |
| | mmaction_video | 1 | None | 74.93 | 66.74 | 41(TEM)+25(PEM) | 0.074(TEM)+0.036(PEM) | ckpt_tem ckpt_pem | log_tem log_pem | json_tem json_pem |
| | mmaction_clip | 1 | None | 75.19 | 66.81 | 41(TEM)+25(PEM) | 0.074(TEM)+0.036(PEM) | ckpt_tem ckpt_pem | log_tem log_pem | json_tem json_pem |
Notes:

- The **gpus** column indicates the number of GPUs used to obtain the checkpoint. According to the Linear Scaling Rule, you may set the learning rate proportional to the total batch size if you use a different number of GPUs or videos per GPU, e.g., lr=0.01 for 4 GPUs x 2 videos/gpu and lr=0.08 for 16 GPUs x 4 videos/gpu.
- In the **feature** column, cuhk_mean_100 denotes the widely used CUHK ActivityNet feature extracted by anet2016-cuhk, while mmaction_video and mmaction_clip denote features extracted by mmaction with the video-level or clip-level ActivityNet-finetuned model, respectively.

For more details on data preparation, you can refer to the ActivityNet feature section in Data Preparation.
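As a sanity check, the Linear Scaling Rule in the note above can be written as a tiny helper. This is an illustrative sketch (the function name is ours); the base setting of lr=0.01 at 4 GPUs x 2 videos/gpu is taken from the note.

```python
def scale_lr(base_lr, base_batch, gpus, videos_per_gpu):
    """Scale the learning rate linearly with the total batch size."""
    return base_lr * (gpus * videos_per_gpu) / base_batch

# Base setting from the note: lr=0.01 at 4 GPUs x 2 videos/gpu (total batch 8).
print(scale_lr(0.01, 8, 16, 4))  # 16 GPUs x 4 videos/gpu -> 0.08
```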
You can use the following commands to train a model.

```shell
python tools/train.py ${CONFIG_FILE} [optional arguments]
```
Examples:

- Train BSN(TEM) on the ActivityNet features dataset.

  ```shell
  python tools/train.py configs/localization/bsn/bsn_tem_400x100_1x16_20e_activitynet_feature.py
  ```

- Train BSN(PEM) on PGM results.

  ```shell
  python tools/train.py configs/localization/bsn/bsn_pem_400x100_1x16_20e_activitynet_feature.py
  ```
For more details and optional arguments, refer to the Training setting section in getting_started.
You can use the following commands to run inference with a model.

- TEM inference:

  ```shell
  # Note: This could not be evaluated.
  python tools/test.py ${CONFIG_FILE} ${CHECKPOINT_FILE} [optional arguments]
  ```

- PGM inference:

  ```shell
  python tools/bsn_proposal_generation.py ${CONFIG_FILE} [--mode ${MODE}]
  ```

- PEM inference:

  ```shell
  python tools/test.py ${CONFIG_FILE} ${CHECKPOINT_FILE} [optional arguments]
  ```
Examples:

- Run BSN(TEM) inference with a pretrained model.

  ```shell
  python tools/test.py configs/localization/bsn/bsn_tem_400x100_1x16_20e_activitynet_feature.py checkpoints/SOME_CHECKPOINT.pth
  ```

- Run BSN(PGM) inference with a pretrained model.

  ```shell
  python tools/bsn_proposal_generation.py configs/localization/bsn/bsn_pgm_400x100_activitynet_feature.py --mode train
  ```

- Run BSN(PEM) inference with the evaluation metric 'AR@AN' and output the results.

  ```shell
  # Note: If evaluated, make sure the annotation file for the test data contains ground truth.
  python tools/test.py configs/localization/bsn/bsn_pem_400x100_1x16_20e_activitynet_feature.py checkpoints/SOME_CHECKPOINT.pth --eval AR@AN --out results.json
  ```
You can use the following commands to test a model.

- TEM:

  ```shell
  # Note: This could not be evaluated.
  python tools/test.py ${CONFIG_FILE} ${CHECKPOINT_FILE} [optional arguments]
  ```

- PGM:

  ```shell
  python tools/bsn_proposal_generation.py ${CONFIG_FILE} [--mode ${MODE}]
  ```

- PEM:

  ```shell
  python tools/test.py ${CONFIG_FILE} ${CHECKPOINT_FILE} [optional arguments]
  ```
Examples:

- Test the TEM model on the ActivityNet dataset.

  ```shell
  python tools/test.py configs/localization/bsn/bsn_tem_400x100_1x16_20e_activitynet_feature.py checkpoints/SOME_CHECKPOINT.pth
  ```

- Test the PGM model on the ActivityNet dataset.

  ```shell
  python tools/bsn_proposal_generation.py configs/localization/bsn/bsn_pgm_400x100_activitynet_feature.py --mode test
  ```

- Test the PEM model with the evaluation metric 'AR@AN' and output the results.

  ```shell
  python tools/test.py configs/localization/bsn/bsn_pem_400x100_1x16_20e_activitynet_feature.py checkpoints/SOME_CHECKPOINT.pth --eval AR@AN --out results.json
  ```
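For intuition, the AR@AN metric used above measures how many ground-truth segments are recalled by the top-N proposals at a set of temporal IoU thresholds. A minimal sketch of the underlying recall computation follows; the function names and the plain list-of-(start, end) segment format are illustrative, not mmaction's internal API.

```python
def temporal_iou(p, g):
    """Temporal IoU between two (start, end) segments."""
    inter = max(0.0, min(p[1], g[1]) - max(p[0], g[0]))
    union = max(p[1], g[1]) - min(p[0], g[0])
    return inter / union if union > 0 else 0.0

def recall_at_n(proposals, gts, n=100, thr=0.5):
    """Fraction of ground-truth segments matched by a top-n proposal at tIoU >= thr."""
    top = proposals[:n]
    hits = sum(any(temporal_iou(p, g) >= thr for p in top) for g in gts)
    return hits / len(gts) if gts else 0.0

# Example: one of two ground-truth segments is covered by the top-2 proposals.
print(recall_at_n([(0, 10), (40, 45)], [(0, 10), (20, 30)], n=2, thr=0.5))  # -> 0.5
```

AR@100 then averages this recall over the tIoU thresholds and over all videos with N=100, and AUC is the area under the resulting AR-AN curve.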
Notes:

- (Optional) You can use the following command to generate a formatted proposal file, which will be fed into an action classifier (currently only SSN and P-GCN are supported; TSN, I3D, etc. are not) to get the classification results of the proposals.

  ```shell
  python tools/data/activitynet/convert_proposal_format.py
  ```

For more details and optional arguments, refer to the Test a dataset section in getting_started.