Poster
Semialgebraic Optimization for Lipschitz Constants of ReLU Networks
Tong Chen · Jean Lasserre · Victor Magron · Edouard Pauwels

Thu Dec 10 09:00 AM -- 11:00 AM (PST) @ Poster Session 5 #1632

The Lipschitz constant of a network plays an important role in many applications of deep learning, such as robustness certification and Wasserstein Generative Adversarial Networks. We introduce a semidefinite programming hierarchy to estimate the global and local Lipschitz constants of a multi-layer deep neural network. The novelty is to combine a polynomial lifting of the ReLU function's derivative with a weak generalization of Putinar's positivity certificate. This idea could also apply to other nearly sparse polynomial optimization problems in machine learning. We empirically demonstrate that our method provides a trade-off with respect to the state-of-the-art linear programming approach, and in some cases we obtain better bounds in less time.
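For context, the simplest certified upper bound on a ReLU network's Lipschitz constant is the product of the spectral norms of its weight matrices, which holds because ReLU is 1-Lipschitz; this is the loose baseline that LP and SDP hierarchies such as the one in this work aim to tighten. Below is a minimal sketch of that baseline bound, assuming NumPy and hypothetical weight matrices W1 and W2 (none of this code is from the paper):

import numpy as np

def naive_lipschitz_bound(weights):
    """Product of the layers' spectral norms.

    For f(x) = W2 @ relu(W1 @ x), ReLU is 1-Lipschitz, so
    ||f||_Lip <= ||W2||_2 * ||W1||_2. This is a valid but
    typically loose global bound.
    """
    return float(np.prod([np.linalg.norm(W, ord=2) for W in weights]))

rng = np.random.default_rng(0)
W1 = rng.standard_normal((64, 32))  # hypothetical hidden layer
W2 = rng.standard_normal((1, 64))   # hypothetical output layer
print(naive_lipschitz_bound([W1, W2]))

The semidefinite hierarchy in the paper instead treats the exact Lipschitz constant as a polynomial optimization problem, trading extra computation for bounds that can be substantially tighter than this product bound.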

Author Information

Tong Chen (LAAS-CNRS)
Jean Lasserre (LAAS-CNRS)
Victor Magron (LAAS-CNRS)
Edouard Pauwels (IRIT)
