The dominant approaches to text representation in natural language processing rely on embeddings learned from massive corpora, which enjoy convenient properties such as compositionality and distance preservation. In this paper, we develop a novel method to learn a heavy-tailed embedding with desirable regularity properties regarding the distributional tails, which allows one to analyze points far away from the distribution bulk using the framework of multivariate extreme value theory. In particular, we obtain a classifier dedicated to the tails of the proposed embedding that exhibits a scale-invariance property, which we exploit in a novel text generation method for label-preserving dataset augmentation. Experiments on synthetic and real text data show the relevance of the proposed framework and confirm that the method generates meaningful sentences with controllable attributes, e.g. positive or negative sentiment.
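To make the scale-invariance idea concrete, here is a minimal sketch (not the authors' code; the classifier and weights are hypothetical): a classifier that acts only on the angular component x / ||x|| of a point satisfies f(λx) = f(x) for any λ > 0, so rescaling a tail sample along its ray leaves the predicted label unchanged, which is the property a label-preserving augmentation can exploit.

```python
import numpy as np

def angular_classifier(x, w):
    """Toy linear classifier applied to the angle theta = x / ||x||.

    Because it depends on x only through its direction, its output is
    invariant to positive rescaling of x (scale invariance)."""
    theta = x / np.linalg.norm(x)
    return int(theta @ w > 0)

# Illustrative data: w and x are arbitrary; x is scaled up to mimic a
# point far from the distribution bulk (the "extreme region").
rng = np.random.default_rng(0)
w = rng.normal(size=5)
x = 10.0 * rng.normal(size=5)

# Rescaling the tail point by any lambda > 0 preserves the label.
for lam in (0.5, 1.0, 2.0, 100.0):
    assert angular_classifier(lam * x, w) == angular_classifier(x, w)
```

The augmentation method described in the abstract relies on an analogous idea in the learned embedding space: perturbing the norm of an extreme embedding while keeping its direction does not change the label assigned by the tail classifier.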
Author Information
Hamid Jalalzai (Telecom ParisTech)
Pierre Colombo (Telecom ParisTech)
Chloé Clavel (Telecom-ParisTech, Paris, France)
Eric Gaussier (Université Joseph Fourier, Grenoble)
Giovanna Varni (Telecom ParisTech)
Emmanuel Vignon (IBM)
Anne Sabourin (LTCI, Telecom ParisTech, Université Paris-Saclay)
More from the Same Authors
- 2022 : Membership Inference Attacks via Adversarial Examples »
  Hamid Jalalzai · Elie Kadoche · Rémi Leluc · Vincent Plassier
- 2020 Poster: Smooth And Consistent Probabilistic Regression Trees »
  Sami Alkhoury · Emilie Devijver · Marianne Clausel · Myriam Tami · Eric Gaussier · Georges Oppenheim
- 2018 Poster: On Binary Classification in Extreme Regions »
  Hamid Jalalzai · Stephan Clémençon · Anne Sabourin
- 2013 Poster: On Flat versus Hierarchical Classification in Large-Scale Taxonomies »
  Rohit Babbar · Ioannis Partalas · Eric Gaussier · Massih R. Amini