A Reparametrization-Invariant Sharpness Measure Based on Information Geometry
Cheongjae Jang · Sungyoon Lee · Frank Park · Yung-Kyun Noh

Tue Nov 29 02:00 PM -- 04:00 PM (PST) @ Hall J #415

It has been observed that the generalization performance of neural networks correlates with the sharpness of their loss landscape. Dinh et al. (2017) showed that existing formulations of sharpness fail to be invariant under scaling and reparametrization. While some scale-invariant measures have recently been proposed, reparametrization-invariant measures are still lacking. Moreover, existing measures often provide neither theoretical insight into generalization performance nor a practical means of improving it. Based on an information-geometric analysis of the neural network parameter space, in this paper we propose a reparametrization-invariant sharpness measure that captures the change in loss with respect to changes in the probability distribution modeled by neural networks, rather than with respect to changes in the parameter values. We reveal theoretical connections between our measure and generalization performance, and experiments confirm that using our measure as a regularizer in neural network training significantly improves performance.
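The abstract's core idea can be illustrated with a toy numerical sketch. The following is not the paper's exact measure but a hypothetical Fisher-normalized sharpness proxy, lambda_max(F^-1 H), where H is the loss Hessian and F the Fisher information matrix. Under an invertible linear reparametrization of the weights, both H and F transform by congruence (H -> A^T H A), so a Euclidean proxy such as lambda_max(H) changes while the Fisher-normalized proxy does not; the matrices H, F, and A below are arbitrary stand-ins chosen for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy curvature and metric at a minimum (illustrative values only)
H = np.diag([4.0, 1.0])      # loss Hessian in the original parametrization
F = np.diag([2.0, 1.0])      # Fisher information in the original parametrization
A = rng.normal(size=(2, 2))  # a random invertible linear reparametrization

# Both quantities transform by congruence under w -> A v
H2 = A.T @ H @ A             # Hessian in the new coordinates
F2 = A.T @ F @ A             # Fisher in the new coordinates

# Euclidean sharpness proxy: largest Hessian eigenvalue (not invariant)
euclid = (np.linalg.eigvalsh(H).max(), np.linalg.eigvalsh(H2).max())

# Fisher-normalized proxy: largest eigenvalue of F^{-1} H; the new-coordinate
# matrix F2^{-1} H2 = A^{-1} (F^{-1} H) A is similar to the old one, so its
# eigenvalues (and hence the proxy) are unchanged by the reparametrization
fisher = (np.linalg.eigvals(np.linalg.solve(F, H)).real.max(),
          np.linalg.eigvals(np.linalg.solve(F2, H2)).real.max())

print("Euclidean sharpness before/after reparametrization:", euclid)   # changes
print("Fisher-normalized sharpness before/after:", fisher)             # identical
```

This captures only the invariance property motivating the work; the paper's actual measure is defined via changes in the modeled probability distribution, not this simplified eigenvalue proxy.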

Author Information

Cheongjae Jang (Hanyang University)
Sungyoon Lee (Korea Institute for Advanced Study)
Frank Park (Seoul National University)
Yung-Kyun Noh (Hanyang University / Korea Institute for Advanced Study)