Exponential families are widely used in machine learning; they include many distributions in continuous and discrete domains (e.g., Gaussian, Dirichlet, Poisson, and categorical distributions via the softmax transformation). Distributions in each of these families have fixed support. In contrast, for finite domains, there has been recent work on sparse alternatives to softmax (e.g., sparsemax and alpha-entmax), which have varying support, being able to assign zero probability to irrelevant categories. These discrete sparse mappings have been used for improving interpretability of neural attention mechanisms. This paper expands that work in two directions: first, we extend alpha-entmax to continuous domains, revealing a link with Tsallis statistics and deformed exponential families. Second, we introduce continuous-domain attention mechanisms, deriving efficient gradient backpropagation algorithms for alpha in {1,2}. Experiments on attention-based text classification, machine translation, and visual question answering illustrate the use of continuous attention in 1D and 2D, showing that it allows attending to time intervals and compact regions.
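To make the abstract's notion of a sparse alternative to softmax concrete, here is a minimal NumPy sketch of sparsemax (the alpha = 2 case of alpha-entmax), computed via the standard sort-based Euclidean projection onto the probability simplex; the function name and implementation details are illustrative, not the paper's own code:

```python
import numpy as np

def sparsemax(z):
    """Sparsemax: Euclidean projection of the score vector z onto the
    probability simplex. Unlike softmax, it can return exact zeros for
    low-scoring entries, yielding a sparse probability distribution."""
    z = np.asarray(z, dtype=float)
    z_sorted = np.sort(z)[::-1]              # scores in descending order
    k = np.arange(1, len(z) + 1)
    cumsum = np.cumsum(z_sorted)
    # support: largest k with k * z_(k) > (sum of top-k scores) - 1
    support = k * z_sorted > cumsum - 1
    k_max = k[support][-1]
    tau = (cumsum[support][-1] - 1) / k_max  # threshold subtracted from scores
    return np.maximum(z - tau, 0.0)
```

For example, `sparsemax([2.0, 1.0, 0.1])` puts all mass on the first entry and assigns exact zeros to the other two, whereas softmax would give every entry strictly positive probability.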
Author Information
André Martins (Instituto de Telecomunicações)
António Farinhas (Instituto de Telecomunicações, Instituto Superior Técnico)
Marcos Treviso (Instituto de Telecomunicações)
Vlad Niculae (Instituto de Telecomunicações)
Pedro Aguiar (Instituto Superior Técnico)
Mario Figueiredo (University of Lisbon)
Related Events (a corresponding poster, oral, or spotlight)
- 2020 Poster: Sparse and Continuous Attention Mechanisms
  Thu. Dec 10th 05:00 -- 07:00 PM, Poster Session 5 #1414
More from the Same Authors
- 2021: COCO Denoiser: Using Co-Coercivity for Variance Reduction in Stochastic Convex Optimization
  Manuel Madeira · Renato Negrinho · Joao Xavier · Pedro Aguiar
- 2022 Poster: Learning to Scaffold: Optimizing Model Explanations for Teaching
  Patrick Fernandes · Marcos Treviso · Danish Pruthi · André Martins · Graham Neubig
- 2020 Poster: Efficient Marginalization of Discrete and Structured Latent Variables via Sparsity
  Gonçalo Correia · Vlad Niculae · Wilker Aziz · André Martins
- 2020 Spotlight: Efficient Marginalization of Discrete and Structured Latent Variables via Sparsity
  Gonçalo Correia · Vlad Niculae · Wilker Aziz · André Martins