Exponential expressivity in deep neural networks through transient chaos

Ben Poole · Subhaneil Lahiri · Maithra Raghu · Jascha Sohl-Dickstein · Surya Ganguli

Area 5+6+7+8 #95

Keywords: [ (Cognitive/Neuroscience) Theoretical Neuroscience ] [ Deep Learning or Neural Networks ]


We combine Riemannian geometry with the mean field theory of high dimensional chaos to study the nature of signal propagation in deep neural networks with random weights. Our results reveal a phase transition in the expressivity of random deep networks, with networks in the chaotic phase computing nonlinear functions whose global curvature grows exponentially with depth, but not with width. We prove that this generic class of random functions cannot be efficiently computed by any shallow network, going beyond prior work that restricts its analysis to single functions. Moreover, we formally quantify and demonstrate the long conjectured idea that deep networks can disentangle exponentially curved manifolds in input space into flat manifolds in hidden space. Our theoretical framework for analyzing the expressive power of deep networks is broadly applicable and provides a basis for quantifying previously abstract notions about the geometry of deep functions.
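The ordered/chaotic phase transition described in the abstract can be probed empirically: propagate two nearby inputs through the same random deep tanh network and watch their correlation across layers. The sketch below is an illustrative simulation, not the authors' released code; the parameter names (`sigma_w`, `sigma_b`) and the specific values for the "ordered" and "chaotic" settings are assumptions chosen to sit on either side of the transition for tanh networks.

```python
import numpy as np

def pair_correlation(sigma_w, sigma_b=0.3, width=1000, depth=30,
                     eps=0.1, seed=0):
    """Propagate two nearby inputs through the SAME random deep tanh
    network and return the cosine similarity of the hidden states at
    each layer. Weights ~ N(0, sigma_w^2 / width), biases ~ N(0, sigma_b^2),
    mirroring the mean-field scaling in the abstract's setting."""
    rng = np.random.default_rng(seed)
    x1 = rng.normal(size=width)
    x2 = x1 + eps * rng.normal(size=width)  # small perturbation of x1
    sims = []
    h1, h2 = x1, x2
    for _ in range(depth):
        # Fresh random layer, shared by both inputs.
        W = rng.normal(0.0, sigma_w / np.sqrt(width), size=(width, width))
        b = rng.normal(0.0, sigma_b, size=width)
        h1 = np.tanh(W @ h1 + b)
        h2 = np.tanh(W @ h2 + b)
        sims.append(h1 @ h2 / (np.linalg.norm(h1) * np.linalg.norm(h2)))
    return np.array(sims)

# Ordered phase (small weight variance): nearby inputs stay correlated.
ordered = pair_correlation(sigma_w=0.5)
# Chaotic phase (large weight variance): correlation decays with depth,
# so nearby inputs are driven apart — a signature of the expressivity
# transition described above.
chaotic = pair_correlation(sigma_w=3.0)
```

In the ordered phase the final similarity stays near 1, while in the chaotic phase it decays toward a fixed point well below 1; the decorrelation rate with depth (not width) is what underlies the exponential growth of curvature the abstract describes.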