Loss Surfaces, Mode Connectivity, and Fast Ensembling of DNNs
Timur Garipov · Pavel Izmailov · Dmitrii Podoprikhin · Dmitry Vetrov · Andrew Wilson

Tue Dec 04 01:20 PM -- 01:25 PM (PST) @ Room 220 E

The loss functions of deep neural networks are complex, and their geometric properties are not well understood. We show that the optima of these complex loss functions are in fact connected by simple curves, over which training and test accuracy are nearly constant. We introduce a training procedure to discover these high-accuracy pathways between modes. Inspired by this new geometric insight, we also propose a new ensembling method called Fast Geometric Ensembling (FGE). Using FGE we can train high-performing ensembles in the time required to train a single model. We achieve improved performance compared to the recent state-of-the-art Snapshot Ensembles on CIFAR-10, CIFAR-100, and ImageNet.
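As a rough illustration of the FGE idea, the snippet below sketches a triangular cyclical learning-rate schedule: the rate ramps down from a high value to a low value over the first half of each short cycle and back up over the second half, and a model snapshot is collected each time the rate bottoms out; averaging the snapshots' predictions gives the ensemble. The function names, cycle length, and the rates `lr_max`/`lr_min` here are illustrative placeholders, not the paper's exact settings.

```python
def cyclical_lr(i, cycle_len, lr_max=0.05, lr_min=0.0005):
    """Triangular cyclical learning rate for iteration i.

    Within each cycle of length `cycle_len`, the rate descends
    linearly from lr_max to lr_min, then ascends back to lr_max.
    """
    t = (i % cycle_len) / cycle_len  # position within cycle, in [0, 1)
    if t < 0.5:
        # First half of the cycle: ramp down toward lr_min.
        return lr_max + (lr_min - lr_max) * (2 * t)
    # Second half: ramp back up toward lr_max.
    return lr_min + (lr_max - lr_min) * (2 * t - 1)


def snapshot_iterations(num_iters, cycle_len):
    """Iterations at which to save an ensemble member: mid-cycle,
    where the learning rate reaches its minimum."""
    return [i for i in range(num_iters) if i % cycle_len == cycle_len // 2]
```

In a training loop, one would set the optimizer's learning rate to `cyclical_lr(i, cycle_len)` at each iteration and checkpoint the model weights at the iterations returned by `snapshot_iterations`; because the cycles are short, many snapshots accumulate within roughly the budget of a single training run.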

Author Information

Timur Garipov (Moscow State University)
Pavel Izmailov (Cornell University)
Dmitrii Podoprikhin (XTX Markets)
Dmitry Vetrov (Higher School of Economics, Samsung AI Center, Moscow)
Andrew Wilson (Cornell University)

I am a professor of machine learning at New York University.
