We propose DecaProp (Densely Connected Attention Propagation), a new densely connected neural architecture for reading comprehension (RC). Our model has two distinct characteristics. First, it densely connects all pairwise layers of the network, modeling relationships between passage and query across all hierarchical levels. Second, the dense connectors in our network are learned via attention instead of standard residual skip-connectors. To this end, we propose novel Bidirectional Attention Connectors (BAC) for efficiently forging connections throughout the network. We conduct extensive experiments on four challenging RC benchmarks. Our proposed approach achieves state-of-the-art results on all four, outperforming existing baselines by 2.6% to 14.2% in absolute F1 score.
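To make the BAC idea concrete, the following is a minimal PyTorch sketch of a bidirectional attention connector between one passage layer and one query layer. The function name, the bilinear affinity, and the tensor shapes are illustrative assumptions, and the paper's factorization-machine compression step (which keeps the connectors parameter-efficient) is omitted for brevity.

```python
import torch
import torch.nn.functional as F

def bidirectional_attention_connector(p, q, w):
    """Sketch of a bidirectional attention connector (not the paper's exact BAC).

    p: passage states, shape (len_p, d)
    q: query states,   shape (len_q, d)
    w: learned bilinear weight, shape (d, d)
    Returns attended features flowing in both directions.
    """
    # Affinity between every passage/query position pair.
    e = p @ w @ q.t()                  # (len_p, len_q)
    # Passage-to-query: each passage token attends over the query.
    p2q = F.softmax(e, dim=1) @ q      # (len_p, d)
    # Query-to-passage: each query token attends over the passage.
    q2p = F.softmax(e, dim=0).t() @ p  # (len_q, d)
    return p2q, q2p

# Hypothetical usage: connect one pair of layers; in DecaProp a connector
# like this would link every pair of passage/query layers in the hierarchy.
p = torch.randn(40, 128)   # 40 passage tokens, hidden size 128
q = torch.randn(10, 128)   # 10 query tokens
w = torch.randn(128, 128, requires_grad=True)
p2q, q2p = bidirectional_attention_connector(p, q, w)
```

Because the connector's output would otherwise grow with the number of layer pairs, the paper compresses each attended representation into a low-dimensional vector before propagating it, which is what makes dense pairwise connection tractable.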
Author Information
Yi Tay (Nanyang Technological University)
Anh Tuan Luu (Institute for Infocomm Research)
Siu Cheung Hui (Nanyang Technological University)
Jian Su (I2R, Singapore)
More from the Same Authors
- 2021 Poster: Self-Instantiated Recurrent Units with Dynamic Soft Recursion »
  Aston Zhang · Yi Tay · Yikang Shen · Alvin Chan · Shuai Zhang
- 2019 Poster: Compositional De-Attention Networks »
  Yi Tay · Anh Tuan Luu · Aston Zhang · Shuohang Wang · Siu Cheung Hui
- 2019 Poster: Quaternion Knowledge Graph Embeddings »
  Shuai Zhang · Yi Tay · Lina Yao · Qi Liu
- 2018 Poster: Recurrently Controlled Recurrent Networks »
  Yi Tay · Anh Tuan Luu · Siu Cheung Hui