
Boosting the Performance of Generic Deep Neural Network Frameworks with Log-supermodular CRFs
Hao Xiong · Yangxiao Lu · Nicholas Ruozzi

Tue Nov 29 02:00 PM -- 04:00 PM (PST) @ Hall J #506

Historically, conditional random fields (CRFs) were popular tools in application areas ranging from computer vision to natural language processing, but owing to their higher computational cost and weaker practical performance, they have in many settings fallen out of favor and been replaced by end-to-end deep neural network (DNN) solutions. More recently, combined DNN-CRF approaches have been considered, but their speed and practical performance still fall short of the best-performing pure DNN solutions. In this work, we present a generic combined approach in which a log-supermodular CRF acts as a regularizer that encourages similarity between outputs in a structured prediction task. We show that this combined approach is widely applicable and practical (it incurs only moderate overhead on top of the base DNN solution), and that in some cases it can rival carefully engineered pure DNN solutions for the same structured prediction task.
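To make the regularization idea concrete, the sketch below adds an attractive pairwise penalty to a standard classification loss: neighboring nodes in a graph are penalized for disagreeing in their predicted label distributions. Attractive pairwise potentials of this kind are the canonical log-supermodular case. This is a hypothetical illustration of the general idea, not the paper's exact loss; the function names, the squared-difference penalty, and the weight `lam` are all assumptions.

```python
import numpy as np

def softmax(logits):
    """Row-wise softmax with the usual max-subtraction for stability."""
    z = logits - logits.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def pairwise_agreement_penalty(logits, edges):
    """Attractive (log-supermodular-style) pairwise term: penalize
    disagreement between the label distributions of neighboring nodes.
    Hypothetical sketch -- not the paper's actual CRF energy."""
    p = softmax(logits)
    penalty = 0.0
    for i, j in edges:
        penalty += float(np.sum((p[i] - p[j]) ** 2))
    return penalty

def regularized_loss(logits, labels, edges, lam=0.1):
    """Cross-entropy on per-node predictions plus the CRF-style
    smoothness regularizer, weighted by a hypothetical `lam`."""
    p = softmax(logits)
    n = logits.shape[0]
    ce = -np.log(p[np.arange(n), labels] + 1e-12).mean()
    return ce + lam * pairwise_agreement_penalty(logits, edges)
```

A usage example: for three nodes where the first two agree and the third disagrees, only the edge touching the third node contributes to the penalty, so the regularizer pushes neighboring outputs toward consensus while the cross-entropy term keeps them tied to the labels.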

Author Information

Hao Xiong (University of Texas at Dallas)
Yangxiao Lu (University of Texas at Dallas)
Nicholas Ruozzi (University of Texas at Dallas)
