Affinity Event
Thermal Image Object Detection via Cross-Modal Knowledge Distillation from RGB
Meron Mihret · Abel Mekonnen · Yeabsira Tessema

Affinity Event
Thermal Image Object Detection via Cross-Modal Knowledge Distillation from RGB
Michael Desta · Abel Mekonnen · Selameab Demilew

Workshop · Sun 14:00
Tong Wang: Using Advanced LLMs to Enhance Smaller LLMs - An Interpretable Knowledge Distillation Approach
Tong Wang

Affinity Event
What’s Left After Distillation? How Knowledge Transfer Impacts Fairness and Bias
Aida Mohammadshahi · Yani Ioannou

Workshop
TAID: Temporally Adaptive Interpolated Distillation for Efficient Knowledge Transfer in Language Models
Makoto Shing · Kou Misaki · Han Bao · Sho Yokoi · Takuya Akiba

Poster · Thu 16:30
Over-parameterized Student Model via Tensor Decomposition Boosted Knowledge Distillation
Yu-Liang Zhan · Zhong-Yi Lu · Hao Sun · Ze-Feng Gao

Poster
Wasserstein Distance Rivals Kullback-Leibler Divergence for Knowledge Distillation
Jiaming Lv · Haoyuan Yang · Peihua Li

Workshop
Improving Knowledge Distillation with Teacher's Explanation
Sayantan Chowdhury · Ben Liang · Ali Tizghadam · Ilijc Albanese

Poster · Thu 16:30
LaKD: Length-agnostic Knowledge Distillation for Trajectory Prediction with Any Length Observations
Yuhang Li · Changsheng Li · Ruilin Lv · Rongqing Li · Ye Yuan · Guoren Wang

Workshop
MolKD: Distilling Cross-Modal Knowledge in Chemical Reactions for Molecular Property Prediction
Liang Zeng

Poster · Wed 16:30
Transformers to SSMs: Distilling Quadratic Knowledge to Subquadratic Models
Aviv Bick · Kevin Li · Eric Xing · J. Zico Kolter · Albert Gu

Workshop
Progressive distillation induces an implicit curriculum
Abhishek Panigrahi · Bingbin Liu · Sadhika Malladi · Andrej Risteski · Surbhi Goel