Recent studies have revealed the great potential of universally modeling all typical information extraction tasks (UIE) with a single generative language model (GLM), where the various IE predictions are unified into a linearized hierarchical expression that the GLM decodes. Syntactic structure information, an effective feature that has been extensively utilized in the IE community, should also be beneficial to UIE. In this work, we propose a novel structure-aware GLM that fully unleashes the power of syntactic knowledge for UIE. A heterogeneous structure inductor induces rich heterogeneous structural representations without supervision by post-training an existing GLM. In particular, a structural broadcaster compacts the various latent trees into explicit high-order forests, which help guide better generation during decoding. Finally, we introduce a task-oriented structure fine-tuning mechanism that further adjusts the learned structures to best fit the needs of the end task. Across 12 IE benchmarks spanning 7 tasks, our system shows significant improvements over the baseline UIE system. Further in-depth analyses show that our GLM learns rich task-adaptive structural bias that greatly alleviates the cruxes of UIE: the long-range dependency issue and boundary identification.
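To make the shared target format concrete, below is a minimal Python sketch of how one IE prediction could be serialized into a linearized hierarchical expression of the kind a GLM decodes. The exact bracket grammar and the `linearize` helper are illustrative assumptions for this sketch, not the paper's precise scheme.

```python
# Hypothetical sketch: flattening a nested IE record into a bracketed
# linearized hierarchical expression (format assumed for illustration).

def linearize(record: dict) -> str:
    """Recursively serialize an IE record into a bracketed string."""
    inner = "".join(
        # A child may itself be a nested record (dict) or a plain span.
        f" ({label}: {linearize(child) if isinstance(child, dict) else child})"
        for label, child in record.get("children", [])
    )
    return f"({record['type']}: {record['span']}{inner})"

# A relation-extraction prediction: the person entity "Steve Jobs"
# with a "work for" relation to the organization entity "Apple".
pred = {
    "type": "person",
    "span": "Steve Jobs",
    "children": [("work for", {"type": "organization", "span": "Apple"})],
}

print(linearize(pred))
# -> (person: Steve Jobs (work for: (organization: Apple)))
```

The point of the sketch is only the target shape: every IE task reduces to generating one such bracketed string, which is why a single GLM can cover all of them and why structural bias in the GLM can benefit them uniformly.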
Author Information
Hao Fei (National University of Singapore)
Shengqiong Wu (National University of Singapore)
Jingye Li (Wuhan University)
Bobo Li (Wuhan University)
Fei Li (UMass Lowell)
Libo Qin (Harbin Institute of Technology)
Meishan Zhang (Harbin Institute of Technology (Shenzhen))
Min Zhang (Harbin Institute of Technology (Shenzhen))
Tat-Seng Chua (National University of Singapore)