
A Quantitative Geometric Approach to Neural-Network Smoothness
Zi Wang · Gautam Prakriya · Somesh Jha

Wed Nov 30 09:00 AM -- 11:00 AM (PST) @ Hall J #413
Fast and precise Lipschitz constant estimation of neural networks is an important task in deep learning. Researchers have recently found an intrinsic trade-off between the accuracy and smoothness of neural networks, so training a network with a loose Lipschitz constant estimate imposes a strong regularization and can hurt model accuracy significantly. In this work, we provide a unified theoretical framework, a quantitative geometric approach, to address Lipschitz constant estimation. By adopting this framework, we immediately obtain several theoretical results, including the computational hardness of Lipschitz constant estimation and its approximability. We implement the algorithms induced by this quantitative geometric approach, which are based on semidefinite programming (SDP). Our empirical evaluation demonstrates that they are more scalable and precise than existing tools for estimating Lipschitz constants under $\ell_\infty$-perturbations. Furthermore, we show their intricate relations with other recent SDP-based techniques, both theoretically and empirically. We believe that this unified quantitative geometric perspective can bring new insights and theoretical tools to the investigation of neural-network smoothness and robustness.
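To illustrate why loose Lipschitz estimates matter (the trade-off the abstract refers to), the sketch below contrasts the classic naive upper bound — the product of the layers' induced $\ell_\infty$ operator norms, valid because ReLU is 1-Lipschitz — with an empirical lower bound from random input pairs. This is a minimal illustration of the looseness gap, not the paper's SDP-based method; the network and sizes are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny 2-layer ReLU network f(x) = W2 @ relu(W1 @ x) (illustrative only)
W1 = rng.standard_normal((8, 4))
W2 = rng.standard_normal((1, 8))

def relu(z):
    return np.maximum(z, 0.0)

def f(x):
    return W2 @ relu(W1 @ x)

def linf_operator_norm(W):
    # Induced l_inf -> l_inf norm of a matrix: maximum absolute row sum.
    return np.abs(W).sum(axis=1).max()

# Naive layer-wise product bound on the l_inf Lipschitz constant:
# each ReLU is 1-Lipschitz, so Lip(f) <= ||W2||_inf * ||W1||_inf.
naive_upper = linf_operator_norm(W2) * linf_operator_norm(W1)

# Empirical lower bound: max over sampled pairs of
# ||f(x) - f(y)||_inf / ||x - y||_inf.
empirical_lower = 0.0
for _ in range(2000):
    x = rng.standard_normal(4)
    y = x + 1e-3 * rng.standard_normal(4)
    num = np.abs(f(x) - f(y)).max()
    den = np.abs(x - y).max()
    empirical_lower = max(empirical_lower, num / den)

print(f"naive upper bound:     {naive_upper:.3f}")
print(f"empirical lower bound: {empirical_lower:.3f}")
```

The true Lipschitz constant lies between the two printed values; the (often large) gap between them is exactly what tighter estimators such as the paper's SDP-based algorithms aim to close.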

Author Information

Zi Wang (University of Wisconsin-Madison)
Gautam Prakriya (The Chinese University of Hong Kong)
Somesh Jha (University of Wisconsin-Madison)
