

All 2022 Events: 48 results (page 4 of 4)
Workshop
Fri 7:35 Collective Knowledge Graph Completion with Mutual Knowledge Distillation
Weihang Zhang · Ovidiu Serban · Jiahao Sun · Yike Guo
Workshop
An Efficient Memory-Augmented Transformer for Knowledge-Intensive NLP Tasks
Yuxiang Wu · Yu Zhao · Baotian Hu · Pasquale Minervini · Pontus Lars Erik Saito Stenetorp · Sebastian Riedel
Workshop
Parameter-Efficient Low-Resource Dialogue State Tracking by Prompt Tuning
Mingyu Derek Ma · Jiun-Yu Kao · Shuyang Gao · Arpit Gupta · Di Jin · Tagyoung Chung · Nanyun Peng
Workshop
Fri 7:56 Fast DistilBERT on CPUs
Haihao Shen · Ofir Zafrir · Bo Dong · Hengyu Meng · Xinyu Ye · Zhe Wang · Yi Ding · Hanwen Chang · Guy Boudoukh · Moshe Wasserblat
Workshop
Using Selective Masking as a Bridge between Pre-training and Fine-tuning
Tanish Lad · Himanshu Maheshwari · Shreyas Kottukkal · Radhika Mamidi
Workshop
Using Informative Data Subsets for Efficient Training of Large Language Models: An Initial Study
H S V N S Kowndinya Renduchintala · Krishnateja Killamsetty · Sumit Bhatia · Milan Aggarwal · Ganesh Ramakrishnan · Rishabh Iyer
Workshop
Fri 10:05 Efficient Few-Shot Learning Without Prompts
Oren Pereg · Daniel Korat · Moshe Wasserblat · Lewis Tunstall · Unso Eun Seo Jo · Luke Bates · Nils Reimers
Workshop
Fri 12:50 Improving the Robustness of DistilHuBERT to Unseen Noisy Conditions via Data Augmentation, Curriculum Learning, and Multi-Task Enhancement
Heitor Guimarães · Arthur Pimentel · Anderson R. Avila · Mehdi Rezagholizadeh · Tiago H Falk