

Oral

Understanding Dropout

Pierre Baldi · Peter Sadowski

Harvey's Convention Center Floor, CC

Abstract:

Dropout is a relatively new algorithm for training neural networks which relies on stochastically "dropping out" neurons during training in order to avoid the co-adaptation of feature detectors. We introduce a general formalism for studying dropout on either units or connections, with arbitrary probability values, and use it to analyze the averaging and regularizing properties of dropout in both linear and non-linear networks. For deep neural networks, the averaging properties of dropout are characterized by three recursive equations, including the approximation of expectations by normalized weighted geometric means. We provide estimates and bounds for these approximations and corroborate the results with simulations. We also show in simple cases how dropout performs stochastic gradient descent on a regularized error function.
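
The following is a minimal sketch, not the authors' code, illustrating two of the abstract's claims for a single unit with dropout applied to its inputs; all variable names and the choice of a single-unit setting are assumptions made for illustration. Part (a) checks numerically that the ensemble expectation E[sigma(S)] over dropout masks is approximated by the normalized weighted geometric mean (NWGM) of the sub-network outputs, which for a logistic unit equals sigma(E[S]), i.e., the ordinary forward pass with weights scaled by the retention probability p. Part (b) checks, for a linear unit with squared error, that the expected dropout error equals the error of the mean network plus a weight-decay-like penalty, one simple case of the regularization result mentioned above.

```python
# Sketch (hypothetical code, not from the paper): single-unit dropout checks.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(s):
    return 1.0 / (1.0 + np.exp(-s))

n, p, M = 20, 0.5, 200_000          # inputs, retention probability, mask samples
w = rng.normal(size=n)              # weights of a single unit
x = rng.normal(size=n)              # one input vector
masks = rng.random((M, n)) < p      # Bernoulli(p) dropout masks on the inputs
S = masks @ (w * x)                 # pre-activation sum under each sampled mask

# (a) Averaging: E[sigma(S)] vs. NWGM of sampled outputs vs. sigma(E[S]).
O = sigmoid(S)
g1 = np.exp(np.log(O).mean())       # geometric mean of the outputs
g0 = np.exp(np.log(1 - O).mean())   # geometric mean of the complements
nwgm = g1 / (g1 + g0)               # normalized weighted geometric mean
print(O.mean(), nwgm, sigmoid(p * (w @ x)))   # all three should be close

# (b) Regularization: expected squared error of a linear unit under dropout
#     equals the error of the p-scaled mean network plus a penalty
#     0.5 * p * (1 - p) * sum_i w_i^2 x_i^2.
t = 1.0                             # an arbitrary target value
mc_error = 0.5 * ((t - S) ** 2).mean()
closed = 0.5 * (t - p * (w @ x)) ** 2 + 0.5 * p * (1 - p) * np.sum(w**2 * x**2)
print(mc_error, closed)             # should agree up to sampling noise
```

Under these assumptions, the (b) identity follows directly from E[S] = p w.x and Var(S) = p(1-p) sum_i w_i^2 x_i^2 for independent Bernoulli masks, which is why the penalty term has a weight-decay form.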
