
Workshop: OPT 2023: Optimization for Machine Learning

Reducing Predict and Optimize to Convex Feasibility

Saurabh Mishra · Sharan Vaswani


Numerous applications in operations research and computer science require a combination of prediction and optimization -- use historical data to predict the parameters of an optimization problem, and solve the optimization problem to output a decision. Addressing these two challenges independently results in the \emph{predict-then-optimize} approach, which can produce a discrepancy between the prediction error, minimized during training, and the ultimate objective of minimizing the decision error. Consequently, recent work has focused on the \emph{predict and optimize} (PO) framework, which instead trains an end-to-end model from the data to the decisions. We focus on linear programs (LPs) within the PO framework, where the main challenge is handling the non-differentiability of LPs. For a linear prediction model, we present a novel reduction from PO to a convex feasibility problem. This reduction enables us to use alternating projections onto convex sets for solving the PO problem, resulting in a computationally efficient and theoretically principled algorithm. Finally, we validate the effectiveness of our approach on synthetic shortest path and fractional knapsack problems, demonstrating improved performance compared to prior work.
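The abstract's algorithmic core is alternating projections onto convex sets (POCS): repeatedly project the current iterate onto each set to find a point in their intersection. The following is a minimal generic sketch of that scheme on two simple convex sets (a halfspace and a Euclidean ball); it does not reproduce the specific feasibility sets arising from the paper's PO reduction, which are not given in the abstract.

```python
import numpy as np

def project_ball(x, center, radius):
    """Euclidean projection onto the ball {x : ||x - center|| <= radius}."""
    d = x - center
    norm = np.linalg.norm(d)
    if norm <= radius:
        return x
    return center + radius * d / norm

def project_halfspace(x, a, b):
    """Euclidean projection onto the halfspace {x : a @ x <= b}."""
    violation = a @ x - b
    if violation <= 0:
        return x
    return x - violation * a / (a @ a)

def alternating_projections(x0, a, b, center, radius, iters=500):
    """Alternate projections onto the halfspace and the ball.

    If the intersection is nonempty, the iterates converge to a point in it
    (a classical property of POCS for convex sets)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        x = project_halfspace(x, a, b)
        x = project_ball(x, center, radius)
    return x

# Illustrative example (not from the paper): intersect the unit ball at the
# origin with the halfspace {x : x_1 + x_2 <= 0}, starting from an
# infeasible point.
a = np.array([1.0, 1.0])
x_star = alternating_projections(np.array([3.0, 2.0]), a, 0.0, np.zeros(2), 1.0)
```

After convergence, `x_star` satisfies both constraints: it lies inside the unit ball and on the feasible side of the halfspace.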
