Poster
Learning to Learn Variational Semantic Memory
Xiantong Zhen · Yingjun Du · Huan Xiong · Qiang Qiu · Cees Snoek · Ling Shao

Wed Dec 09 09:00 AM -- 11:00 AM (PST) @ Poster Session 3 #774

In this paper, we introduce variational semantic memory into meta-learning to acquire long-term knowledge for few-shot learning. The variational semantic memory accrues and stores semantic information for the probabilistic inference of class prototypes in a hierarchical Bayesian framework. The semantic memory is grown from scratch and gradually consolidated by absorbing information from the tasks it experiences, allowing it to accumulate the long-term, general knowledge needed to learn new object concepts. We formulate memory recall as the variational inference of a latent memory variable from addressed contents, which offers a principled way to adapt the stored knowledge to individual tasks. Our variational semantic memory, as a new long-term memory module, provides principled recall and update mechanisms that enable semantic information to be efficiently accrued and adapted for few-shot learning. Experiments demonstrate that the probabilistic modelling of prototypes yields a more informative representation of object classes than deterministic vectors. Consistent new state-of-the-art performance on four benchmarks shows the benefit of variational semantic memory in boosting few-shot recognition.
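To make the recall mechanism described in the abstract concrete, the minimal PyTorch sketch below illustrates one plausible reading: the support features of a class address a memory of stored class-level summaries, an inference network maps the addressed content to a Gaussian over a latent memory variable, and a sample of that variable conditions the mean and variance of a probabilistic prototype. This is an illustrative assumption, not the authors' released implementation; all names (SemanticMemoryRecall, feat_dim, mem_slots, latent_dim) are hypothetical.

```python
# Hypothetical sketch of the memory recall step: content-based addressing of a
# semantic memory followed by variational inference of a latent memory variable
# that conditions a probabilistic class prototype. Names and shapes are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SemanticMemoryRecall(nn.Module):
    """Address a memory of class summaries and infer a latent memory variable."""

    def __init__(self, feat_dim: int, mem_slots: int, latent_dim: int):
        super().__init__()
        # Long-term semantic memory: one row per stored class summary.
        self.memory = nn.Parameter(torch.randn(mem_slots, feat_dim) * 0.01)
        # Amortized inference: addressed content -> Gaussian over latent memory z.
        self.to_mu = nn.Linear(feat_dim, latent_dim)
        self.to_logvar = nn.Linear(feat_dim, latent_dim)
        # Combine latent memory with the support mean to parameterize the prototype.
        self.proto_mu = nn.Linear(feat_dim + latent_dim, feat_dim)
        self.proto_logvar = nn.Linear(feat_dim + latent_dim, feat_dim)

    def forward(self, support_feats: torch.Tensor):
        # support_feats: [n_shot, feat_dim] features of one class in the current task.
        query = support_feats.mean(dim=0, keepdim=True)       # [1, feat_dim]
        # Content-based addressing: soft attention over memory slots.
        attn = F.softmax(query @ self.memory.t(), dim=-1)     # [1, mem_slots]
        addressed = attn @ self.memory                        # [1, feat_dim]
        # Variational recall: sample z ~ q(z | addressed content).
        mu, logvar = self.to_mu(addressed), self.to_logvar(addressed)
        z = mu + torch.randn_like(mu) * (0.5 * logvar).exp()  # reparameterization
        # Probabilistic prototype conditioned on task evidence and recalled memory.
        h = torch.cat([query, z], dim=-1)
        return self.proto_mu(h), self.proto_logvar(h)


if __name__ == "__main__":
    recall = SemanticMemoryRecall(feat_dim=64, mem_slots=20, latent_dim=32)
    proto_mu, proto_logvar = recall(torch.randn(5, 64))  # 5-shot support set
    print(proto_mu.shape, proto_logvar.shape)            # torch.Size([1, 64]) each
```

In the paper's framework this recall step would sit inside a hierarchical Bayesian model whose memory is consolidated across tasks; the sketch covers only the addressing-and-inference portion, not the consolidation or update rules.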

Author Information

Xiantong Zhen (University of Amsterdam)
Yingjun Du (University of Amsterdam)
Huan Xiong (Mohamed bin Zayed University of Artificial Intelligence (MBZUAI))
Qiang Qiu (Purdue University)
Cees Snoek (University of Amsterdam)
Ling Shao (Inception Institute of Artificial Intelligence)
