Fri 7:00 a.m. - 7:05 a.m. | Opening remarks
Fri 7:05 a.m. - 7:25 a.m. | Invited talk | Attention in Task-sets, Planning, and the Prefrontal Cortex | Ida Momennejad
Fri 7:25 a.m. - 7:45 a.m. | Invited talk | Relating transformers to models and neural representations of the hippocampal formation | James Whittington
Fri 7:45 a.m. - 8:05 a.m. | Invited talk | Eye Gaze in Human-Robot Collaboration | Henny Admoni
Fri 8:05 a.m. - 8:25 a.m. | Invited talk | Attending to What's Not There | Tobias Gerstenberg
Fri 8:25 a.m. - 8:35 a.m. | Spotlight | Foundations of Attention Mechanisms in Deep Neural Network Architectures | Pierre Baldi · Roman Vershynin
Fri 8:35 a.m. - 8:45 a.m. | Spotlight | Is Attention Interpretation? A Quantitative Assessment On Sets | Jonathan D. Haab · Nicolas Deutschmann · Maria Rodriguez Martinez
Fri 9:00 a.m. - 10:00 a.m. | Panel discussion | Panel I (in-person)
Fri 10:00 a.m. - 11:00 a.m. | Lunch
Fri 11:00 a.m. - 12:00 p.m. | Poster session | Poster session + coffee break
Fri 12:00 p.m. - 12:20 p.m. | Invited talk | Exploiting Human Interactions to Learn Human Attention | Shalini De Mello
Fri 12:20 p.m. - 12:40 p.m. | Invited talk | BrainProp: How Attentional Processes in the Brain Solve the Credit Assignment Problem | Pieter Roelfsema
Fri 12:40 p.m. - 1:00 p.m. | Invited talk | Attention as Interpretable Information Processing in Machine Learning Systems | Erin Grant
Fri 1:00 p.m. - 1:20 p.m. | Invited talk | Accelerating human attention research via ML applied to smartphones | Vidhya Navalpakkam
Fri 1:20 p.m. - 1:30 p.m. | Spotlight | Wide Attention Is The Way Forward For Transformers | Jason Brown · Yiren Zhao · I Shumailov · Robert Mullins
Fri 1:30 p.m. - 1:40 p.m. | Spotlight | Fine-tuning hierarchical circuits through learned stochastic co-modulation | Caroline Haimerl · Eero Simoncelli · Cristina Savin
Fri 1:40 p.m. - 1:50 p.m. | Spotlight | Hierarchical Abstraction for Combinatorial Generalization in Object Rearrangement | Michael Chang · Alyssa L Dayan · Franziska Meier · Tom Griffiths · Sergey Levine · Amy Zhang
Fri 2:00 p.m. - 3:00 p.m. | Poster session | Poster session + coffee break
Fri 3:00 p.m. - 3:55 p.m. | Panel discussion | Panel II (virtual)
Fri 3:55 p.m. - 4:00 p.m. | Closing remarks
- | Poster | Bounded logit attention: Learning to explain image classifiers | Thomas Baumhauer · Djordje Slijepcevic · Matthias Zeppelzauer
- | Poster | TDLR: Top Semantic-Down Syntactic Language Representation | Vipula Rawte · Megha Chakraborty · Kaushik Roy · Manas Gaur · Keyur Faldu · Prashant Kikani · Amit Sheth
- | Poster | Attention for Compositional Modularity | Oleksiy Ostapenko · Pau Rodriguez · Alexandre Lacoste · Laurent Charlin
- | Poster | Systematic Generalization and Emergent Structures in Transformers Trained on Structured Tasks | Yuxuan Li · James McClelland
- | Poster | The Paradox of Choice: On the Role of Attention in Hierarchical Reinforcement Learning | Andrei Nica · Khimya Khetarpal · Doina Precup
- | Poster | FuzzyNet: A Fuzzy Attention Module for Polyp Segmentation | Krushi Patel · Guanghui Wang · Fengjun Li
- | Poster | Is Attention Interpretation? A Quantitative Assessment On Sets | Jonathan D. Haab · Nicolas Deutschmann · Maria Rodriguez Martinez
- | Poster | Wide Attention Is The Way Forward For Transformers | Jason Brown · Yiren Zhao · I Shumailov · Robert Mullins
- | Poster | Attention as inference with third-order interactions | Yicheng Fei · Xaq Pitkow
- | Poster | Hierarchical Abstraction for Combinatorial Generalization in Object Rearrangement | Michael Chang · Alyssa L Dayan · Franziska Meier · Tom Griffiths · Sergey Levine · Amy Zhang
- | Poster | Improving cross-modal attention via object detection | Yongil Kim · Yerin Hwang · Seunghyun Yoon · HyeonGu Yun · Kyomin Jung
- | Poster | Graph Attention for Spatial Prediction | Corban Rivera · Ryan Gardner
- | Poster | Faster Attention Is What You Need: A Fast Self-Attention Neural Network Backbone Architecture for the Edge via Double-Condensing Attention Condensers | Alexander Wong · Mohammad Javad Shafiee · Saad Abbasi · Saeejith Nair · Mahmoud Famouri
- | Poster | Fine-tuning hierarchical circuits through learned stochastic co-modulation | Caroline Haimerl · Eero Simoncelli · Cristina Savin
- | Poster | First De-Trend then Attend: Rethinking Attention for Time-Series Forecasting | Xiyuan Zhang · Xiaoyong Jin · Karthick Gopalswamy · Gaurav Gupta · Youngsuk Park · Xingjian Shi · Hao Wang · Danielle Maddix · Yuyang (Bernie) Wang
- | Poster | Quantifying attention via dwell time and engagement in a social media browsing environment | Ziv Epstein · Hause Lin · Gordon Pennycook · David Rand
- | Poster | Revisiting Attention Weights as Explanations from an Information Theoretic Perspective | Bingyang Wen · Koduvayur (Suba) Subbalakshmi · Fan Yang
- | Poster | Foundations of Attention Mechanisms in Deep Neural Network Architectures | Pierre Baldi · Roman Vershynin
- | Poster | Unlocking Slot Attention by Changing Optimal Transport Costs | Yan Zhang · David Zhang · Simon Lacoste-Julien · Gertjan Burghouts · Cees Snoek