

Poster

Generalization Bound and Learning Methods for Data-Driven Projections in Linear Programming

Shinsaku Sakaue · Taihei Oki

West Ballroom A-D #6006
[ Project Page ] [ Paper ] [ Slides ] [ Poster ] [ OpenReview ]
Wed 11 Dec 4:30 p.m. PST — 7:30 p.m. PST

Abstract: How to solve high-dimensional linear programs (LPs) efficiently is a fundamental question. Recently, there has been a surge of interest in reducing LP sizes using *random projections*, which can accelerate solving LPs independently of improvements to LP solvers. This paper explores a new direction of *data-driven projections*, which use projection matrices learned from data instead of random projection matrices. Given training data of n-dimensional LPs, we learn an n×k projection matrix with n > k. When addressing a future LP instance, we reduce its dimensionality from n to k via the learned projection matrix, solve the resulting LP to obtain a k-dimensional solution, and apply the learned matrix to it to recover an n-dimensional solution. On the theoretical side, a natural question is: how much data is sufficient to ensure the quality of recovered solutions? We address this question based on the framework of *data-driven algorithm design*, which connects the amount of data sufficient for establishing generalization bounds to the *pseudo-dimension* of performance metrics. We obtain an Õ(nk²) upper bound on the pseudo-dimension, where Õ suppresses logarithmic factors. We also provide an Ω(nk) lower bound, implying that our result is tight up to an Õ(k) factor. On the practical side, we explore two simple methods for learning projection matrices: PCA- and gradient-based methods. While the former is relatively efficient, the latter can sometimes achieve better solution quality. Experiments demonstrate that learning projection matrices from data is indeed beneficial: it leads to significantly higher solution quality than existing random projections while greatly reducing the time for solving LPs.
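To make the projection pipeline concrete, the following is a minimal sketch in Python, assuming LPs of the form min cᵀx subject to Ax ≤ b and using `numpy` with `scipy.optimize.linprog`. The function names and the simplified PCA-style construction are illustrative assumptions, not code from the paper.

```python
import numpy as np
from scipy.optimize import linprog

def pca_projection(X_opt, k):
    """Build an n-by-k projection matrix from optimal solutions of training LPs.

    X_opt: array of shape (num_instances, n), one optimal solution per row.
    This is a simplified PCA-style construction (top-k right singular vectors
    of the stacked solutions), offered only as an illustration.
    """
    _, _, Vt = np.linalg.svd(X_opt, full_matrices=False)
    return Vt[:k].T  # shape (n, k)

def solve_projected_lp(c, A, b, P):
    """Solve min c^T x s.t. A x <= b restricted to the subspace x = P y.

    The reduced LP has only k variables: min (P^T c)^T y  s.t.  (A P) y <= b.
    Any feasible y gives a feasible x = P y for the original LP, so the
    recovered solution is feasible but generally suboptimal.
    """
    res = linprog(P.T @ c, A_ub=A @ P, b_ub=b,
                  bounds=(None, None), method="highs")
    if not res.success:  # the reduced LP may be infeasible or unbounded
        raise RuntimeError(res.message)
    return P @ res.x  # lift the k-dimensional solution back to n dimensions
```

In this sketch, replacing `pca_projection` with a random Gaussian matrix recovers the random-projection baseline; the gradient-based variant mentioned in the abstract would instead optimize P against solution quality on the training instances, which requires differentiating through the reduced LP and is omitted here.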
