Search All 2022 Events
 

78 Results

Page 1 of 7
Poster
Thu 14:00 BiT: Robustly Binarized Multi-distilled Transformer
Zechun Liu · Barlas Oguz · Aasish Pappu · Lin Xiao · Scott Yih · Meng Li · Raghuraman Krishnamoorthi · Yashar Mehdad
Poster
Tue 9:00 Spectral Bias in Practice: The Role of Function Frequency in Generalization
Sara Fridovich-Keil · Raphael Gontijo Lopes · Rebecca Roelofs
Poster
Thu 9:00 Meta-DMoE: Adapting to Domain Shift by Meta-Distillation from Mixture-of-Experts
Tao Zhong · Zhixiang Chi · Li Gu · Yang Wang · Yuanhao Yu · Jin Tang
Workshop
SADT: Combining Sharpness-Aware Minimization with Self-Distillation for Improved Model Generalization
Masud An Nur Islam Fahim · Jani Boutellier
Poster
Wed 9:00 Weighted Distillation with Unlabeled Examples
Fotis Iliopoulos · Vasilis Kontonis · Cenk Baykal · Gaurav Menghani · Khoa Trinh · Erik Vee
Poster
Tue 14:00 Preservation of the Global Knowledge by Not-True Distillation in Federated Learning
Gihun Lee · Minchan Jeong · Yongjin Shin · Sangmin Bae · Se-Young Yun
Poster
Wed 9:00 Distilled Gradient Aggregation: Purify Features for Input Attribution in the Deep Neural Network
Giyoung Jeon · Haedong Jeong · Jaesik Choi
Poster
Thu 9:00 Geometric Knowledge Distillation: Topology Compression for Graph Neural Networks
Chenxiao Yang · Qitian Wu · Junchi Yan
Poster
Tue 9:00 Distilling Representations from GAN Generator via Squeeze and Span
Yu Yang · Xiaotian Cheng · Chang Liu · Hakan Bilen · Xiangyang Ji
Poster
Knowledge Distillation Improves Graph Structure Augmentation for Graph Neural Networks
Lirong Wu · Haitao Lin · Yufei Huang · Stan Z. Li
Poster
Wed 14:00 Symbolic Distillation for Learned TCP Congestion Control
S P Sharan · Wenqing Zheng · Kuo-Feng Hsu · Jiarong Xing · Ang Chen · Zhangyang Wang
Poster
PKD: General Distillation Framework for Object Detectors via Pearson Correlation Coefficient
Weihan Cao · Yifan Zhang · Jianfei Gao · Anda Cheng · Ke Cheng · Jian Cheng