Poster
Faster Randomized Infeasible Interior Point Methods for Tall/Wide Linear Programs
Agniva Chowdhury · Palma London · Haim Avron · Petros Drineas
Linear programming (LP) is used in many machine learning applications, such as $\ell_1$-regularized SVMs, basis pursuit, and nonnegative matrix factorization. Interior Point Methods (IPMs) are among the most popular methods for solving LPs, both in theory and in practice. Their underlying complexity is dominated by the cost of solving a system of linear equations at each iteration. In this paper, we consider \emph{infeasible} IPMs for the special case where the number of variables is much larger than the number of constraints (i.e., wide LPs), or vice versa (i.e., tall LPs, which can be handled by taking the dual). Using tools from Randomized Linear Algebra, we present a preconditioning technique that, when combined with the Conjugate Gradient iterative solver, provably guarantees that infeasible IPM algorithms (suitably modified to account for the error incurred by the approximate solver) converge to a feasible, approximately optimal solution, without increasing their iteration complexity. Our empirical evaluations verify our theoretical results on both real and synthetic data.
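To make the preconditioning idea concrete, here is a minimal NumPy sketch (not the authors' implementation): each IPM iteration must solve a normal-equations system $(A D A^\top)\,x = b$ with a positive diagonal $D$ that changes every iteration; the sketch below builds a preconditioner from a Gaussian projection of $A D^{1/2}$ and runs Conjugate Gradient on the preconditioned system. The function name, the dense Gaussian sketch, and the sketch size are illustrative assumptions; the paper's analysis fixes specific sketching dimensions and also covers other random projections.

```python
import numpy as np

def randomized_precond_cg(A, d, b, sketch_size, tol=1e-10, max_iter=100):
    """Solve (A diag(d) A^T) x = b with preconditioned CG, where A is
    m x n with n >> m (wide LP) and d > 0 entrywise. Illustrative sketch,
    not the paper's exact algorithm or parameter choices."""
    m, n = A.shape
    AD_half = A * np.sqrt(d)                  # A diag(d)^{1/2}, shape m x n
    # Gaussian sketching matrix (the paper also analyzes sparse sketches)
    S = np.random.randn(n, sketch_size) / np.sqrt(sketch_size)
    W = AD_half @ S                           # m x sketch_size sketch
    _, R = np.linalg.qr(W.T, mode='reduced')  # W W^T = R^T R ~ A diag(d) A^T
    Rinv = np.linalg.inv(R)                   # m is small; explicit inverse is OK here

    M = AD_half @ AD_half.T                   # normal-equations matrix A diag(d) A^T
    # CG on the preconditioned SPD system R^{-T} M R^{-1} y = R^{-T} b
    P = Rinv.T @ M @ Rinv
    rhs = Rinv.T @ b
    y = np.zeros(m)
    r = rhs - P @ y
    p = r.copy()
    rs = r @ r
    for _ in range(max_iter):
        Pp = P @ p
        alpha = rs / (p @ Pp)
        y += alpha * p
        r -= alpha * Pp
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return Rinv @ y                           # recover x = R^{-1} y
```

Since $D$ changes at every IPM iteration, the sketch and the QR factorization would be recomputed each round; the point of the sketching step is that this costs far less than factoring the full $n$-column system when $n \gg m$.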
Author Information
Agniva Chowdhury (Purdue University)
Palma London (Cornell)
Haim Avron (Tel Aviv University)
Petros Drineas (Purdue University)