
Workshop: OPT 2022: Optimization for Machine Learning

Fast Convergence of Greedy 2-Coordinate Updates for Optimizing with an Equality Constraint

Amrutha Varshini Ramesh · Aaron Mishkin · Mark Schmidt


In this work, we study the block coordinate descent (BCD) algorithm with a greedy block-selection rule for minimizing functions subject to a single linear equality constraint. For block size 2 (2-BCD), we prove a faster linear convergence rate on functions satisfying the Polyak-Lojasiewicz (PL) inequality. Our analysis exploits the similarity between the 2-BCD update and equality-constrained steepest descent in the L1-norm, which yields a simple proof.
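The core idea of a greedy 2-coordinate update with a sum constraint can be sketched as follows: at each step, pick the coordinates with the largest and smallest partial derivatives and shift mass between them, so the constraint sum(x) = const is preserved exactly. This is an illustrative sketch, not the authors' implementation; the function names, step-size choice, and test problem are assumptions for demonstration.

```python
def greedy_2bcd(grad, x, step=0.1, iters=500):
    """Greedy 2-coordinate descent preserving sum(x) = const (illustrative sketch).

    grad: callable returning the gradient of f at x (as a list).
    At each iteration, the coordinate pair with the largest gradient gap
    is selected, and an equal-and-opposite update keeps sum(x) unchanged.
    The fixed step size is a placeholder; in practice it should be matched
    to the function's smoothness or chosen by line search.
    """
    for _ in range(iters):
        g = grad(x)
        i = max(range(len(x)), key=lambda k: g[k])  # largest partial derivative
        j = min(range(len(x)), key=lambda k: g[k])  # smallest partial derivative
        delta = step * (g[i] - g[j])
        x[i] -= delta  # decrease where the gradient is largest
        x[j] += delta  # increase where it is smallest; sum(x) is unchanged
    return x


# Toy example (assumed for illustration): minimize 0.5 * ||x - b||^2
# subject to sum(x) = 0; the solution is b shifted to have zero mean.
b = [1.0, 2.0, 3.0]
x = greedy_2bcd(lambda x: [xk - bk for xk, bk in zip(x, b)], [0.0, 0.0, 0.0])
```

For this quadratic, the iterates converge to [-1, 0, 1] (i.e., b minus its mean) while the constraint sum(x) = 0 holds at every iteration.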
