

Poster

Dichotomize and Generalize: PAC-Bayesian Binary Activated Deep Neural Networks

Gaël Letarte · Pascal Germain · Benjamin Guedj · François Laviolette

East Exhibition Hall B + C #218

Keywords: [ Theory ] [ Learning Theory ] [ Deep Learning ]


Abstract:

We present a comprehensive study of multilayer neural networks with binary activation, relying on the PAC-Bayesian theory. Our contributions are twofold: (i) we develop an end-to-end framework to train a binary activated deep neural network, and (ii) we provide nonvacuous PAC-Bayesian generalization bounds for binary activated deep neural networks. Our results are obtained by minimizing the expected loss of an architecture-dependent aggregation of binary activated deep neural networks. Our analysis inherently overcomes the fact that the binary activation function is non-differentiable. The performance of our approach is assessed through a thorough numerical experiment protocol on real-life datasets.
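To illustrate how averaging over a distribution of binary activated networks sidesteps the non-differentiable sign activation, the sketch below (not the authors' code; a standard Gaussian identity used for illustration) shows that for a single binary unit with a Gaussian posterior N(mu, I) over its weights, the posterior-expected output equals erf(mu·x / (√2‖x‖)), a smooth function of mu that admits gradient-based training:

```python
# Minimal sketch, assuming a sign activation and an isotropic Gaussian posterior
# over a single unit's weights; the paper's aggregation is architecture-dependent
# and may differ in its exact formulation.
import numpy as np
from scipy.special import erf

rng = np.random.default_rng(0)
d = 5
mu = rng.normal(size=d)   # posterior mean of the unit's weights (hypothetical values)
x = rng.normal(size=d)    # an input example

# Monte Carlo estimate of the expected binary activation under the posterior.
samples = rng.normal(loc=mu, scale=1.0, size=(200_000, d))
mc_estimate = np.sign(samples @ x).mean()

# Closed form: E_{w~N(mu,I)}[sign(w.x)] = erf(mu.x / (sqrt(2)*||x||)),
# which is differentiable in mu even though sign(.) is not.
closed_form = erf(mu @ x / (np.sqrt(2) * np.linalg.norm(x)))

print(f"Monte Carlo : {mc_estimate:.4f}")
print(f"Closed form : {closed_form:.4f}")
```

The two estimates agree up to Monte Carlo noise, which is the sense in which the expected loss of the aggregation is a smooth objective even though each individual network is binary activated.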
