# Learning Second-Order Attentive Context for Efficient Correspondence Pruning (AAAI 2023)

## Requirements & Compilation

Required packages are listed in `requirements.txt`.

The code is tested with Python 3.7.10 and PyTorch 1.7.1.

1. Compile the extra modules:

```bash
cd utils/extend_utils
python build_extend_utils_cffi.py
```

Depending on your CUDA installation path, you may need to revise the `cuda_version` variable in `build_extend_utils_cffi.py`.
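A minimal, hypothetical sketch of that adjustment (the variable name comes from the note above; the value shown and its format are assumptions for a toolkit installed at `/usr/local/cuda-11.1`, so mirror whatever default already ships in the script):

```python
# utils/extend_utils/build_extend_utils_cffi.py (sketch)
# Point this at your local CUDA toolkit, e.g. /usr/local/cuda-11.1.
# Check the installed version with `nvcc --version` or `ls /usr/local`.
cuda_version = '11.1'  # assumed value/format; follow the script's existing default
```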

## Datasets & Pretrained Models

  1. Download the YFCC100M dataset and the SUN3D dataset from the OANet repository.

  2. Download the pretrained models from here.

  3. Unzip and arrange all files as follows:

```
data/
├── model/
│   ├── ANANet/
│   │   ├── build_model.yaml
│   │   └── model_best.yaml
│   ├── ...
│   └── your model/
├── yfcc100m/
├── sun3d_test/
└── pair/
```

Note that if you have downloaded YFCC100M or SUN3D to another path, you can redefine `data_root` in `pose_dataset.py`.
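A minimal sketch of that override, assuming `data_root` is a module-level path in `pose_dataset.py` and that the loaders resolve dataset directories relative to it (the helper lines below are illustrative, not the repository's actual code):

```python
# pose_dataset.py (sketch): point data_root at wherever the datasets live.
import os

data_root = '/path/to/your/data'  # replace with your own download location

# Illustration only: dataset paths would then be resolved relative to data_root,
# matching the directory layout shown above.
yfcc_dir = os.path.join(data_root, 'yfcc100m')
sun3d_dir = os.path.join(data_root, 'sun3d_test')
```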

## Evaluation

Evaluate on YFCC100M:

```bash
python eval.py --name yfcc --cfg configs/eval/ANANet/yfcc.yaml
```

Evaluate on SUN3D:

```bash
python eval.py --name sun3d --cfg configs/eval/ANANet/sun3d.yaml
```

## Citation

```bibtex
@InProceedings{Ye_2023_AAAI,
    author    = {Ye, Xinyi and Zhao, Weiyue and Lu, Hao and Cao, Zhiguo},
    title     = {Learning Second-Order Attentive Context for Efficient Correspondence Pruning},
    booktitle = {Proceedings of the AAAI Conference on Artificial Intelligence (AAAI)},
    month     = {Jun.},
    year      = {2023},
    pages     = {3250-3258},
    volume    = {37},
    number    = {3}
}
```

## Acknowledgement

We have used code from the following repositories, and we thank the authors for sharing their code.

- OANet: https://github.com/zjhthu/OANet
- LMCNet: https://github.com/liuyuan-pal/LMCNet
- SuperGlue: https://github.com/magicleap/SuperGluePretrainedNetwork