Last updated: 2020.04.18
- A Gift from Knowledge Distillation: Fast Optimization, Network Minimization and Transfer Learning. Yim et al. KAIST. CVPR 2017.
- Similarity-Preserving Knowledge Distillation. Tung et al. Borealis AI. ICCV 2019. (A minimal sketch of its loss follows this list.)
- Knowledge Distillation via Route Constrained Optimization. Jin et al. SenseTime. ICCV 2019.
- A Comprehensive Overhaul of Feature Distillation. Heo et al. NAVER. ICCV 2019.
- Contrastive Representation Distillation. Tian et al. MIT. ICLR 2020.
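
As a concrete illustration of the kind of objective these papers propose, here is a minimal PyTorch sketch of the similarity-preserving distillation loss from Tung et al. (ICCV 2019), which matches the pairwise activation-similarity matrices of teacher and student over a batch. The function name `sp_loss` and the tensor shapes are assumptions made for illustration, not the authors' released code.

```python
import torch
import torch.nn.functional as F

def sp_loss(f_t: torch.Tensor, f_s: torch.Tensor) -> torch.Tensor:
    """Similarity-preserving KD loss (Tung et al., ICCV 2019) -- sketch.

    f_t, f_s: teacher / student feature maps of shape (b, c, h, w).
    Channel counts may differ; the batch size b must match.
    """
    b = f_t.size(0)
    # Pairwise similarity matrices over the batch, shape (b, b).
    g_t = torch.mm(f_t.view(b, -1), f_t.view(b, -1).t())
    g_s = torch.mm(f_s.view(b, -1), f_s.view(b, -1).t())
    # Row-wise L2 normalization, as in the paper.
    g_t = F.normalize(g_t, p=2, dim=1)
    g_s = F.normalize(g_s, p=2, dim=1)
    # Squared Frobenius distance, scaled by 1 / b^2.
    return (g_t - g_s).pow(2).sum() / (b * b)
```

In training, this term is added to the usual task loss with a weighting hyperparameter (γ in the paper), computed at one or more pairs of corresponding teacher/student layers.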