Knowledge distillation is an effective way to compress a large model into a smaller, faster one while keeping most of its accuracy. Today I'll explain how student-teacher training works. 🧵