

Poster

Zero-shot Learning via Simultaneous Generating and Learning

Hyeonwoo Yu · Beomhee Lee

East Exhibition Hall B + C #25

Keywords: Algorithms -> Missing Data; Deep Learning -> Deep Autoencoders; Deep Learning; Generative Models; Few-Shot Learning; Algorithms


Abstract:

To overcome the absence of training data for unseen classes, conventional zero-shot learning approaches mainly train their models on seen datapoints and leverage semantic descriptions of both seen and unseen classes. Going beyond exploiting relations between seen and unseen classes, we present a deep generative model that provides the model with experience of both seen and unseen classes. Based on a variational auto-encoder with a class-specific multi-modal prior, the proposed method learns the conditional distribution of seen and unseen classes. To circumvent the need for samples of unseen classes, we treat the non-existing data as missing examples. That is, our network seeks optimal unseen datapoints and model parameters by iteratively following a generating-and-learning strategy. Since we obtain a conditional generative model for both seen and unseen classes, classification as well as generation can be performed directly, without any off-the-shelf classifiers. Experimental results demonstrate that the proposed generating-and-learning strategy outperforms a model trained only on the seen classes, as well as several state-of-the-art methods.
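As a rough illustration of the generating-and-learning loop described above, here is a minimal PyTorch-style sketch. Everything in it (the `ConditionalVAE` class, `generate_and_learn`, layer sizes, `n_pseudo`, and the Gaussian class-specific prior derived from attributes) is an assumption made for illustration, not the authors' implementation.

```python
# Hypothetical sketch (not the authors' code): a conditional VAE whose latent
# prior mean depends on the class's semantic attributes, trained by alternating
# between (1) generating pseudo-samples for unseen classes from the current
# model and (2) updating the model on seen data plus those pseudo-samples.
import torch
import torch.nn as nn
import torch.nn.functional as F


class ConditionalVAE(nn.Module):
    def __init__(self, x_dim, attr_dim, z_dim=64, h_dim=256):
        super().__init__()
        self.enc = nn.Sequential(nn.Linear(x_dim, h_dim), nn.ReLU())
        self.mu, self.logvar = nn.Linear(h_dim, z_dim), nn.Linear(h_dim, z_dim)
        self.dec = nn.Sequential(nn.Linear(z_dim, h_dim), nn.ReLU(),
                                 nn.Linear(h_dim, x_dim))
        # Class-specific prior mean derived from the semantic attributes.
        self.prior_mu = nn.Linear(attr_dim, z_dim)

    def forward(self, x):
        h = self.enc(x)
        mu, logvar = self.mu(h), self.logvar(h)
        z = mu + torch.randn_like(mu) * (0.5 * logvar).exp()  # reparameterize
        return self.dec(z), mu, logvar

    def elbo(self, x, attr):
        x_hat, mu, logvar = self(x)
        rec = F.mse_loss(x_hat, x, reduction="sum")
        # KL divergence to N(prior_mu(attr), I) instead of the usual N(0, I).
        p_mu = self.prior_mu(attr)
        kl = 0.5 * (logvar.exp() + (mu - p_mu) ** 2 - 1 - logvar).sum()
        return rec + kl

    def sample(self, attr, n):
        # Draw latents from the class-specific prior and decode them.
        z = self.prior_mu(attr).expand(n, -1) + torch.randn(n, self.prior_mu.out_features)
        return self.dec(z)


def generate_and_learn(model, seen_x, seen_attr, unseen_attrs,
                       n_pseudo=64, rounds=5, epochs=20, lr=1e-3):
    """Alternate between imputing pseudo unseen-class data and retraining."""
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    for _ in range(rounds):
        # Generation step: treat unseen-class data as missing and impute it.
        with torch.no_grad():
            pseudo = [(model.sample(a.unsqueeze(0), n_pseudo),
                       a.unsqueeze(0).expand(n_pseudo, -1)) for a in unseen_attrs]
        xs = torch.cat([seen_x] + [p[0] for p in pseudo])
        attrs = torch.cat([seen_attr] + [p[1] for p in pseudo])
        # Learning step: update model parameters on seen plus pseudo data.
        for _ in range(epochs):
            opt.zero_grad()
            loss = model.elbo(xs, attrs)
            loss.backward()
            opt.step()
    return model
```

In such a sketch, classification at test time could be done by scoring a datapoint under each class's conditional model (e.g., picking the class whose attribute-conditioned ELBO is highest), which is consistent with the abstract's claim that no off-the-shelf classifier is required.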
