

Poster

SymILO: A Symmetry-Aware Learning Framework for Integer Linear Optimization

Qian Chen · Tianjian Zhang · Linxin Yang · Qingyu Han · Akang Wang · Ruoyu Sun · Xiaodong Luo · Tsung-Hui Chang

West Ballroom A-D #5902
Fri 13 Dec 11 a.m. PST — 2 p.m. PST

Abstract:

Integer linear programs (ILPs) are commonly employed to model diverse practical problems such as scheduling and planning. Recently, machine learning techniques have been utilized to solve ILPs. A straightforward idea is to train a model via supervised learning, with an ILP as the input and its optimal solution as the label. An ILP is symmetric if its variables can be permuted without changing the problem structure, resulting in numerous equivalent optimal solutions. Randomly selecting one of these optimal solutions as the label introduces variability into the training data, which may hinder the model from learning stable patterns. In this work, we incorporate the intrinsic symmetry of ILPs and propose a novel training framework called SymILO. Specifically, we modify the learning task by treating a solution permutation, along with the neural network weights, as learnable parameters, and we design an alternating algorithm to jointly minimize the loss over both. We evaluate our framework on ILPs with different symmetries, and computational results demonstrate that our symmetry-aware approach significantly outperforms symmetry-agnostic ones.
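To make the alternating idea concrete, below is a minimal PyTorch sketch (not the authors' implementation) of training with a permutation-adjusted label. It assumes, purely for illustration, that every variable of an instance is interchangeable (the symmetry group is the full symmetric group), so the label permutation closest to the current prediction under squared loss can be found by rank matching via sorting; the model `VarPredictor`, the helper `best_permutation`, and the data format `(feats, sol)` are hypothetical names.

```python
import torch
import torch.nn as nn

class VarPredictor(nn.Module):
    """Toy per-variable regressor: one feature row per ILP variable."""
    def __init__(self, feat_dim, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(feat_dim, hidden), nn.ReLU(), nn.Linear(hidden, 1)
        )

    def forward(self, x):                  # x: (num_vars, feat_dim)
        return self.net(x).squeeze(-1)     # (num_vars,)

def best_permutation(pred, label):
    """Permute `label` to minimize squared error against `pred`,
    assuming all variables are interchangeable (full symmetric group):
    matching the sorted orders of the two vectors is then optimal."""
    order_pred = torch.argsort(pred)
    order_label = torch.argsort(label)
    permuted = torch.empty_like(label)
    permuted[order_pred] = label[order_label]   # match ranks
    return permuted

def train(model, instances, epochs=50, lr=1e-3):
    """instances: list of (feats, sol) pairs, where sol is one optimal solution."""
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.MSELoss()
    for _ in range(epochs):
        for feats, sol in instances:
            pred = model(feats)
            # Step 1: weights fixed -- pick the symmetric equivalent of the
            # label that is closest to the current prediction.
            target = best_permutation(pred.detach(), sol)
            # Step 2: permuted label fixed -- gradient step on the weights.
            opt.zero_grad()
            loss_fn(pred, target).backward()
            opt.step()
    return model
```

The alternating structure is the point of the sketch: the permutation step never changes which feasible optimal solution is encoded, only which symmetric representative the network is asked to fit, while the gradient step is ordinary supervised training. Handling general formulation-induced symmetry groups (rather than the full symmetric group assumed here) requires a more careful permutation search, which is what the paper's framework addresses.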
