This repository provides the code for our paper "Exploring Inter-Channel Correlation for Diversity-preserved Knowledge Distillation". We provide training code for CIFAR-100 and evaluation code for ImageNet & Pascal VOC. The remaining training code will be released after the paper is accepted.
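As a rough illustration of the inter-channel correlation idea, the sketch below matches channel-wise Gram matrices between teacher and student feature maps in PyTorch. It is a minimal sketch, not the released implementation (which may add normalization, grid-wise partitioning, or a channel-matching embedding); the class name and tensor shapes are assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class InterChannelCorrelation(nn.Module):
    """Sketch: penalize the difference between the inter-channel correlation
    (Gram) matrices of student and teacher feature maps."""

    def forward(self, feat_s, feat_t):
        # feat_s, feat_t: (B, C, H, W); assumes channel counts already match
        b, c, h, w = feat_s.shape
        fs = feat_s.view(b, c, h * w)                         # (B, C, H*W)
        ft = feat_t.view(b, c, h * w)
        corr_s = torch.bmm(fs, fs.transpose(1, 2)) / (h * w)  # (B, C, C)
        corr_t = torch.bmm(ft, ft.transpose(1, 2)) / (h * w)
        return F.mse_loss(corr_s, corr_t)

# Example: combined with the usual cross-entropy / KD losses during training
# loss_icc = InterChannelCorrelation()(student_feat, teacher_feat.detach())
```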
- Download all checkpoints from https://drive.google.com/drive/folders/1ZvwEAVJurTXSuPL_0HylHMNUGbG1zl3U?usp=sharing (a loading sketch follows this list).
- Build a Docker image using the provided Dockerfile. All code should be run inside this Docker image.
- Go into the directory of each task and follow the README there.
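A minimal sketch of loading one of the downloaded checkpoints; the file name and the `"model"` wrapper key are assumptions, and the actual model constructor comes from each task's sub-directory.

```python
import torch

# Hypothetical checkpoint file from the Google Drive folder above.
ckpt = torch.load("checkpoints/resnet32x4_cifar100.pth", map_location="cpu")
# Some checkpoints wrap the weights under a key such as "model".
state_dict = ckpt["model"] if isinstance(ckpt, dict) and "model" in ckpt else ckpt
# model = build_model(...)           # task-specific constructor, see each README
# model.load_state_dict(state_dict)
```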
The license will be updated after the code is released non-anonymously.
This work is built on three repositories: RepDistiller (ICLR 2020), torchdistill (ICPR 2020), and OverHaul (ICCV 2019). Thanks for their great work.