Poster
Thu Dec 01 02:00 PM -- 04:00 PM (PST) @ Hall J #417
Smoothed Embeddings for Certified Few-Shot Learning
Mikhail Pautov · Olesya Kuznetsova · Nurislam Tursynbek · Aleksandr Petiushko · Ivan Oseledets
Randomized smoothing is considered to be the state-of-the-art provable defense against adversarial perturbations. However, it relies heavily on classifiers that map input objects to class probabilities and does not apply directly to models that learn a metric space in which classification is performed by computing distances to the embeddings of class prototypes. In this work, we extend randomized smoothing to few-shot learning models that map inputs to normalized embeddings. We analyze the Lipschitz continuity of such models and derive a robustness certificate against $\ell_2$-bounded perturbations that may be useful in few-shot learning scenarios. Our theoretical results are confirmed by experiments on different datasets.
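The following is a minimal sketch of the high-level idea described in the abstract, not the paper's exact construction or certificate computation: the embedding of an input is smoothed by averaging normalized embeddings of Gaussian-perturbed copies, and classification is performed by the distance to class prototype embeddings. The names `model`, `sigma`, `n_samples`, and `prototypes` are illustrative assumptions.

```python
import torch
import torch.nn.functional as F

def smoothed_embedding(model, x, sigma=0.25, n_samples=100):
    """Monte Carlo estimate of a smoothed embedding: the average of
    normalized embeddings of Gaussian-perturbed copies of the input x."""
    # x: (C, H, W) input; model maps a batch to embeddings of shape (B, D)
    noise = torch.randn(n_samples, *x.shape) * sigma
    z = model(x.unsqueeze(0) + noise)       # (n_samples, D)
    z = F.normalize(z, dim=1)               # project embeddings onto the unit sphere
    return z.mean(dim=0)                    # smoothed embedding estimate, shape (D,)

def classify_by_prototypes(z_smooth, prototypes):
    """Predict the class whose prototype embedding is closest in l2 distance."""
    # prototypes: (num_classes, D) mean embeddings of the support (few-shot) examples
    dists = torch.cdist(z_smooth.unsqueeze(0), prototypes)  # (1, num_classes)
    return dists.argmin(dim=1).item()
```

Under this kind of construction, the smoothness induced by the Gaussian averaging is what allows a Lipschitz-style argument to bound how much an $\ell_2$-bounded perturbation of the input can move the smoothed embedding, and hence when the nearest-prototype prediction cannot change.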