Poster
Does Knowledge Distillation Really Work?
Samuel Stanton · Pavel Izmailov · Polina Kirichenko · Alexander A Alemi · Andrew Wilson

Tue Dec 07 04:30 PM -- 06:00 PM (PST)

Knowledge distillation is a popular technique for training a small student network to emulate a larger teacher model, such as an ensemble of networks. We show that while knowledge distillation can improve student generalization, it does not typically work as it is commonly understood: there often remains a surprisingly large discrepancy between the predictive distributions of the teacher and the student, even in cases when the student has the capacity to perfectly match the teacher. We identify difficulties in optimization as a key reason for why the student is unable to match the teacher. We also show how the details of the dataset used for distillation play a role in how closely the student matches the teacher --- and that more closely matching the teacher paradoxically does not always lead to better student generalization.
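
For readers unfamiliar with the objective under study, the sketch below shows the standard soft-label distillation loss of Hinton et al. (2015) in PyTorch. It is only an illustration of the general technique referenced in the abstract; the function name, temperature T, and mixing weight alpha are assumptions chosen for the example, not the configuration used in this paper.

import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    # Soften both predictive distributions with temperature T; the KL term
    # measures the student-teacher mismatch ("fidelity") discussed above.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    # Standard hard-label cross-entropy on the ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    # alpha trades off imitating the teacher against fitting the labels.
    return alpha * soft + (1.0 - alpha) * hard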

Author Information

Samuel Stanton (New York University)

Sam is a Ph.D. student in the NYU Center for Data Science and an NDSEG Fellow (class of 2018), working with Professor Andrew Wilson. His current research focuses on incorporating probabilistic state transition models into reinforcement learning algorithms. Model-based RL agents generalize from past experience very effectively, allowing them to evaluate policies with fewer environment interactions than their model-free counterparts. Improving the data-efficiency of RL agents is crucial for real-world applications in fields like robotics, logistics, and finance. Sam holds a Master’s degree in Operations Research from Cornell University, where he started working with Professor Wilson as a first-year Ph.D. student. Sam transferred from the Cornell doctoral program to continue his research agenda at NYU with his advisor. Prior to his studies at Cornell, Sam earned a Bachelor’s degree in Mathematics from the University of Colorado Denver, graduating summa cum laude. In addition to his dissertation research, Sam is interested in modern art and philosophy, especially epistemology and ethics. When he is not occupied with research, Sam enjoys volleyball, rock climbing, surfing, and snowboarding.

Pavel Izmailov (New York University)
Polina Kirichenko (New York University)
Alex A Alemi (Disney Research)
Andrew Wilson (New York University)

I am a professor of machine learning at New York University.
