7 Results
Poster | Fri 11:00 | Understanding the Gains from Repeated Self-Distillation | Divyansh Pareek · Simon Du · Sewoong Oh

Poster | Fri 16:30 | Self-Distilled Depth Refinement with Noisy Poisson Fusion | Jiaqi Li · Yiran Wang · Jinghong Zheng · Zihao Huang · Ke Xian · Zhiguo Cao · Jianming Zhang

Poster | Wed 16:30 | FineCLIP: Self-distilled Region-based CLIP for Better Fine-grained Understanding | Dong Jing · Xiaolong He · Yutian Luo · Nanyi Fei · guoxing Yang · Wei Wei · Huiwen Zhao · Zhiwu Lu

Workshop | Decreasing Inconsistencies in Differentially Private Language Models through Self-Distillation | Kieleh Ngong Ivoline Clarisse · Joseph Near · Niloofar Mireshghallah

Workshop | Self-Data Distillation for Recovering Quality in Pruned Large Language Models | Vithursan Thangarasa · Ganesh Venkatesh · Nish Sinnadurai · Sean Lie

Affinity Event | Distilling Visual Information into Symbolic Representations through Self-Supervised Learning | Victor Sebastian Martinez Pozos · Ivan Vladimir Meza Ruiz

Poster | Wed 16:30 | How JEPA Avoids Noisy Features: The Implicit Bias of Deep Linear Self Distillation Networks | Etai Littwin · Omid Saremi · Madhu Advani · Vimal Thilak · Preetum Nakkiran · Chen Huang · Joshua Susskind