Recycle-and-Distill: Universal Compression Strategy for Transformer-based Speech SSL Models with Attention Map Reusing and Masking Distillation

Kangwook Jang ⋅ Sungnyun Kim ⋅ Se-Young Yun ⋅ Hoi Rin Kim

Abstract
