RKD

Relational Knowledge Distillation

Abstract

Knowledge distillation aims at transferring knowledge acquired in one model (a teacher) to another model (a student) that is typically smaller. Previous approaches can be expressed as a form of training the student to mimic output activations of individual data examples represented by the teacher. We introduce a novel approach, dubbed relational knowledge distillation (RKD), that transfers mutual relations of data examples instead. For concrete realizations of RKD, we propose distance-wise and angle-wise distillation losses that penalize structural differences in relations. Experiments conducted on different tasks show that the proposed method improves educated student models by a significant margin. In particular for metric learning, it allows students to outperform their teachers' performance, achieving state-of-the-art results on standard benchmark datasets.
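The two relational losses are compact enough to sketch directly. The PyTorch snippet below is an illustrative sketch rather than this repository's implementation: the function names (`rkd_distance_loss`, `rkd_angle_loss`) and the loss weights in the usage example are assumptions. It shows the distance-wise term comparing mean-normalized pairwise distance matrices and the angle-wise term comparing cosines of angles formed by triplets of embeddings, both penalized with a smooth-L1 (Huber) loss as described in the paper.

```python
import torch
import torch.nn.functional as F


def rkd_distance_loss(student_emb: torch.Tensor, teacher_emb: torch.Tensor,
                      eps: float = 1e-12) -> torch.Tensor:
    """Distance-wise RKD loss: match the pairwise-distance structure of a batch."""

    def normalized_pdist(emb: torch.Tensor) -> torch.Tensor:
        # Pairwise Euclidean distances, normalized by their mean so that only
        # the relative distance structure matters, not the embedding scale.
        dist = torch.cdist(emb, emb, p=2)
        mean_dist = dist[dist > 0].mean()
        return dist / (mean_dist + eps)

    return F.smooth_l1_loss(normalized_pdist(student_emb),
                            normalized_pdist(teacher_emb))


def rkd_angle_loss(student_emb: torch.Tensor, teacher_emb: torch.Tensor) -> torch.Tensor:
    """Angle-wise RKD loss: match cosines of angles formed by example triplets."""

    def triplet_cosines(emb: torch.Tensor) -> torch.Tensor:
        # diff[i, j] is the unit vector pointing from example j to example i.
        diff = F.normalize(emb.unsqueeze(1) - emb.unsqueeze(0), p=2, dim=2)
        # cos[i, j, k] = cosine of the angle at vertex j in triplet (i, j, k).
        return torch.einsum('ijd,kjd->ijk', diff, diff)

    return F.smooth_l1_loss(triplet_cosines(student_emb),
                            triplet_cosines(teacher_emb))


if __name__ == '__main__':
    # Embeddings of one mini-batch; teacher and student dimensions may differ.
    # The 2.0 weight on the angle term is an illustrative choice, not a fixed setting.
    student_emb = torch.randn(32, 128)
    teacher_emb = torch.randn(32, 512)
    loss = rkd_distance_loss(student_emb, teacher_emb) \
        + 2.0 * rkd_angle_loss(student_emb, teacher_emb)
    print(loss.item())
```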

Pipeline of Relational Knowledge Distillation (architecture figure).

Results and models

Classification

| Location | Dataset  | Teacher  | Student  | Acc   | Acc(T) | Acc(S) | Config | Download                |
| :------- | :------- | :------- | :------- | :---- | :----- | :----- | :----- | :---------------------- |
| neck     | ImageNet | resnet34 | resnet18 | 70.23 | 73.62  | 69.90  | config | teacher \| model \| log |

Acc is the top-1 accuracy (%) of the distilled student; Acc(T) and Acc(S) are the teacher and the undistilled student baselines, respectively.

Citation

@inproceedings{park2019relational,
  title={Relational knowledge distillation},
  author={Park, Wonpyo and Kim, Dongju and Lu, Yan and Cho, Minsu},
  booktitle={Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition},
  pages={3967--3976},
  year={2019}
}