
Parallel Direction Method of Multipliers
Huahua Wang · Arindam Banerjee · Zhi-Quan Luo

Mon Dec 08 04:00 PM -- 08:59 PM (PST) @ Level 2, room 210D

We consider the problem of minimizing block-separable convex functions subject to linear constraints. While the Alternating Direction Method of Multipliers (ADMM) for two-block linear constraints has been intensively studied both theoretically and empirically, effective generalizations of ADMM to multiple blocks, despite some preliminary work, remain unclear. In this paper, we propose a parallel randomized block coordinate method named Parallel Direction Method of Multipliers (PDMM) to solve optimization problems with multi-block linear constraints. PDMM randomly updates some blocks in parallel, behaving like parallel randomized block coordinate descent. We establish the global convergence and the iteration complexity of PDMM with constant step size. We also show that PDMM can perform randomized block coordinate descent on overlapping blocks. Experimental results show that PDMM outperforms state-of-the-art methods in two applications: robust principal component analysis and overlapping group lasso.
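To make the setting concrete, the sketch below illustrates the general flavor of the problem class and of randomized parallel block updates on an augmented Lagrangian, not the paper's exact PDMM update rules (which include additional correction terms). All names and constants (`J`, `rho`, `K`, the quadratic objective) are our own choices for a toy instance of min Σᵢ ½‖xᵢ − bᵢ‖² subject to Σᵢ xᵢ = c.

```python
import numpy as np

# Toy multi-block linearly constrained problem:
#   min  sum_i 0.5 * ||x_i - b_i||^2   s.t.   sum_i x_i = c
# Solved with randomized *parallel* (Jacobi-style) block updates on the
# augmented Lagrangian, in the spirit of PDMM but not the authors' exact method.

rng = np.random.default_rng(0)
J, d = 5, 3                       # number of blocks, dimension per block
b = rng.standard_normal((J, d))   # per-block targets (problem data)
c = np.ones(d)                    # right-hand side of the linear constraint
rho, K, iters = 1.0, 2, 3000      # penalty/step size, blocks per round, rounds

x = np.zeros((J, d))              # primal blocks x_1, ..., x_J
y = np.zeros(d)                   # dual variable for sum_i x_i = c

for _ in range(iters):
    picked = rng.choice(J, size=K, replace=False)  # random subset of blocks
    resid = x.sum(axis=0) - c                      # residual at round start
    for i in picked:                               # independent => parallelizable
        # Block subproblem (all other blocks held at their round-start values):
        #   min_{x_i} 0.5*||x_i - b_i||^2 + y.x_i + (rho/2)*||x_i + r_i||^2,
        # where r_i = sum_{j != i} x_j - c; closed form for this quadratic:
        r_i = resid - x[i]
        x[i] = (b[i] - y - rho * r_i) / (1.0 + rho)
    y = y + rho * (x.sum(axis=0) - c)              # dual ascent step

# The constraint residual should be small after enough rounds.
print(np.linalg.norm(x.sum(axis=0) - c))
```

Because only the `K` selected blocks are touched per round and each uses the same round-start residual, the inner loop can be distributed across workers; the paper's analysis is what justifies a constant step size in this randomized parallel regime.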

Author Information

Huahua Wang (University of Minnesota, Twin Cities)
Arindam Banerjee (University of Minnesota, Twin Cities)

Arindam Banerjee is a Professor in the Department of Computer Science & Engineering and a Resident Fellow at the Institute on the Environment at the University of Minnesota, Twin Cities. His research interests are in machine learning, data mining, and applications to complex real-world problems in areas including climate science, ecology, recommendation systems, text analysis, and finance. He has won several awards, including the NSF CAREER award (2010), the IBM Faculty Award (2013), and six best paper awards at top-tier conferences.

Zhi-Quan Luo (University of Minnesota, Twin Cities)
