Poster
Exact Solutions of a Deep Linear Network
Liu Ziyin · Botao Li · Xiangming Meng

Thu Dec 01 09:00 AM -- 11:00 AM (PST) @ Hall J #515
This work finds the analytical expression of the global minima of a deep linear network with weight decay and stochastic neurons, a fundamental model for understanding the landscape of neural networks. Our result implies that zero is a special point in the landscape of deep neural networks. We show that weight decay strongly interacts with the model architecture and can create bad minima at zero in a network with more than $1$ hidden layer, a behavior qualitatively different from that of a network with only $1$ hidden layer. Practically, our result implies that common deep learning initialization methods are insufficient to ease the optimization of neural networks in general.
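A minimal numerical sketch (not taken from the paper) of the depth dependence described above: for a scalar deep linear chain with L2 weight decay, the origin is a saddle point when there is $1$ hidden layer but a local minimum when there are $2$ hidden layers. The scalar setup, the loss form, and the value of the regularization strength lam are illustrative assumptions.

```python
import numpy as np

def reg_loss(w, lam=0.1, target=1.0):
    """L2-regularized squared loss of a scalar deep linear chain:
    (w[0] * w[1] * ... * w[-1] - target)^2 + lam * sum(w_i^2).
    (Illustrative toy objective, not the paper's exact setting.)"""
    return (np.prod(w) - target) ** 2 + lam * np.sum(w ** 2)

def hessian_at(w, lam=0.1, eps=1e-4):
    """Finite-difference Hessian of reg_loss at the point w."""
    n = len(w)
    H = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            e_i, e_j = np.eye(n)[i], np.eye(n)[j]
            H[i, j] = (
                reg_loss(w + eps * e_i + eps * e_j, lam)
                - reg_loss(w + eps * e_i - eps * e_j, lam)
                - reg_loss(w - eps * e_i + eps * e_j, lam)
                + reg_loss(w - eps * e_i - eps * e_j, lam)
            ) / (4 * eps ** 2)
    return H

# depth = number of weight factors: 2 factors = 1 hidden layer, 3 factors = 2 hidden layers
for depth in (2, 3):
    origin = np.zeros(depth)
    eigs = np.linalg.eigvalsh(hessian_at(origin))
    print(f"{depth - 1} hidden layer(s): Hessian eigenvalues at zero = {np.round(eigs, 3)}")

# With lam = 0.1, the 1-hidden-layer chain has a negative eigenvalue at zero (a saddle),
# while the 2-hidden-layer chain has all-positive eigenvalues (a bad local minimum at zero).
```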

Author Information

Liu Ziyin (The University of Tokyo)
Botao Li (École Normale Supérieure)
Xiangming Meng (The University of Tokyo)
