31 Results
| Type | Time | Title | Authors |
|---|---|---|---|
| Workshop | | Knowledge Distillation for Teaching Symmetry Invariances | Patrick Odagiu · Nicole Nobili · Fabian Dionys Schrag · Yves Bicker · Yuhui Ding |
| Workshop | | Knowledge Distillation-Based Model Extraction Attack using GAN-based Private Counterfactual Explanations | Fatima Ezzeddine · Omran Ayoub · Silvia Giordano |
| Workshop | Sun 11:20 | Testing knowledge distillation theories with dataset size | Giulia Lanzillotta · Felix Sarnthein · Gil Kur · Thomas Hofmann · Bobby He |
| Poster | Wed 11:00 | VeXKD: The Versatile Integration of Cross-Modal Fusion and Knowledge Distillation for 3D Perception | JI Yuzhe · Yijie CHEN · Liuqing Yang · Ding Rui · Meng Yang · Xinhu Zheng |
| Poster | Thu 16:30 | CIFD: Controlled Information Flow to Enhance Knowledge Distillation | Yashas Malur Saidutta · Rakshith Sharma Srinivasa · Jaejin Cho · Ching-Hua Lee · Chouchang Yang · Yilin Shen · Hongxia Jin |
| Workshop | | KD-LoRA: A Hybrid Approach to Efficient Fine-Tuning with LoRA and Knowledge Distillation | Rambod Azimi · Rishav Rishav · Marek Teichmann · Samira Ebrahimi Kahou |
| Poster | Thu 11:00 | DDK: Distilling Domain Knowledge for Efficient Large Language Models | Jiaheng Liu · Chenchen Zhang · Jinyang Guo · Yuanxing Zhang · Haoran Que · Ken Deng · Zhiqi Bai · Jie Liu · Ge Zhang · Jiakai Wang · Yanan Wu · Congnan Liu · Jiamang Wang · Lin Qu · Wenbo Su · Bo Zheng |