Tutorial
Lagrangian Relaxation Algorithms for Inference in Natural Language Processing
Alexander Rush · Michael Collins

Mon Dec 12 03:00 AM -- 05:00 AM (PST) @ Andalucia II & III

There has been a long history in combinatorial optimization of methods that exploit structure in complex problems, using techniques such as dual decomposition or Lagrangian relaxation. These methods leverage the observation that complex inference problems can often be decomposed into efficiently solvable sub-problems. Recent work within the machine learning community has explored algorithms for MAP inference in graphical models using these methods, as an alternative, for example, to max-product loopy belief propagation.

In this tutorial, we give an overview of Lagrangian relaxation for inference problems in natural language processing. The goal of the tutorial is twofold:

1) to give an introduction to key inference problems in NLP: for example, problems arising in machine translation, sequence modeling, parsing, and information extraction.

2) to give a formal and practical overview of Lagrangian relaxation as a method for deriving inference algorithms for NLP problems. In general, the algorithms we describe combine combinatorial optimization methods (for example dynamic programming, exact belief propagation, minimum spanning tree, all-pairs shortest path) with subgradient methods from the optimization community; a minimal sketch of this combination appears below. Formal guarantees for the algorithms come from a close relationship to linear programming relaxations.
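To make the template concrete, here is a minimal Python sketch of dual decomposition with a subgradient method. It is not code from the tutorial: the toy problem and all names in it (argmax_f, argmax_g, dual_decompose) are our own illustration. We maximize f(y) + g(y) over binary vectors, where each term has an easy argmax oracle; in real NLP applications the oracles would be dynamic programs such as Viterbi or CKY.

import numpy as np

def argmax_f(theta_f, u):
    # Oracle for sub-problem 1: maximize theta_f.y + u.y over y in {0,1}^n.
    # For a linear objective the argmax decomposes componentwise.
    return (theta_f + u > 0).astype(float)

def argmax_g(theta_g, u):
    # Oracle for sub-problem 2: maximize theta_g.z - u.z over z in {0,1}^n.
    return (theta_g - u > 0).astype(float)

def dual_decompose(theta_f, theta_g, iterations=100, rate=1.0):
    # Minimize the dual L(u) = max_y [f(y) + u.y] + max_z [g(z) - u.z]
    # by subgradient descent; (y - z) is a subgradient of L at u.
    u = np.zeros_like(theta_f)
    for t in range(1, iterations + 1):
        y = argmax_f(theta_f, u)
        z = argmax_g(theta_g, u)
        if np.array_equal(y, z):
            # Agreement is a certificate of optimality: y maximizes f + g exactly.
            return y, True
        u -= (rate / t) * (y - z)
    return y, False  # no certificate within the iteration budget

rng = np.random.default_rng(0)
theta_f, theta_g = rng.normal(size=5), rng.normal(size=5)
solution, certified = dual_decompose(theta_f, theta_g)

In this toy the two oracles share no structure, so the combined problem could be solved directly; the point is only the mechanics that carry over to the NLP settings above: two tractable sub-problems, a multiplier update, and the agreement test that yields the certificate of optimality discussed next.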

For many of the problems we consider, the resulting algorithms produce exact solutions, with certificates of optimality, on the vast majority of examples. In practice the algorithms are efficient both for problems that are NP-hard (as is the case for non-projective parsing or phrase-based translation) and for problems that are solvable in polynomial time by dynamic programming but whose traditional exact algorithms are far too expensive to be practical.

Author Information

Alexander Rush (Facebook)

Alexander Rush is a Ph.D. candidate in computer science at MIT. His research explores novel decoding methods for problems in natural language processing with applications to parsing and statistical machine translation. His paper "Dual Decomposition for Parsing with Non-Projective Head Automata" received the best paper award at EMNLP 2010.

Michael Collins (Columbia University)

Michael Collins is the Vikram S. Pandit Professor of computer science at Columbia University. His research is focused on topics including statistical parsing, structured prediction problems in machine learning, and applications including machine translation, dialog systems, and speech recognition. His awards include a Sloan Fellowship, an NSF CAREER award, and best paper awards at EMNLP (2002, 2004, and 2010), UAI (2004 and 2005), and CoNLL 2008.
