knowledge-distillation

Good Teachers Explain: Explanation-Enhanced Knowledge Distillation

Knowledge Distillation (KD) has proven effective for compressing large teacher models into smaller student models. While it is well known that student models can achieve accuracies similar to those of their teachers, it has also been shown that they nonetheless …
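
For context, a minimal sketch of the classic soft-target distillation loss (Hinton et al., 2015) that the abstract's "KD" refers to is shown below. The function name and hyperparameter values are illustrative, and this is plain KD, not the explanation-enhanced objective the paper proposes.

```python
# Sketch of standard soft-target KD (Hinton et al., 2015); illustrative only,
# not the explanation-enhanced objective from this paper.
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, temperature=4.0, alpha=0.9):
    """Blend soft-target distillation with the usual hard-label cross-entropy."""
    # Soften both output distributions with the temperature, then match them
    # via KL divergence (student in log-space, teacher as probabilities).
    soft_student = F.log_softmax(student_logits / temperature, dim=-1)
    soft_teacher = F.softmax(teacher_logits / temperature, dim=-1)
    distill = F.kl_div(soft_student, soft_teacher, reduction="batchmean")
    # Scale by T^2 so gradient magnitudes stay comparable across temperatures.
    distill = distill * temperature ** 2
    # Hard-label cross-entropy keeps the student anchored to the ground truth.
    ce = F.cross_entropy(student_logits, labels)
    return alpha * distill + (1.0 - alpha) * ce
```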