Poster
Precise characterization of the prior predictive distribution of deep ReLU networks
Lorenzo Noci · Gregor Bachmann · Kevin Roth · Sebastian Nowozin · Thomas Hofmann

Thu Dec 09 12:30 AM -- 02:00 AM (PST) @ None #None

Recent works on Bayesian neural networks (BNNs) have highlighted the need to better understand the implications of using Gaussian priors in combination with the compositional structure of the network architecture. Similar in spirit to the kind of analysis that has been developed to devise better initialization schemes for neural networks (cf. He- or Xavier initialization), we derive a precise characterization of the prior predictive distribution of finite-width ReLU networks with Gaussian weights. While theoretical results have been obtained for their heavy-tailedness, the full characterization of the prior predictive distribution (i.e. its density, CDF and moments) remained unknown prior to this work. Our analysis, based on the Meijer-G function, allows us to quantify the influence of architectural choices such as the width or depth of the network on the resulting shape of the prior predictive distribution. We also formally connect our results to previous work in the infinite width setting, demonstrating that the moments of the distribution converge to those of a normal log-normal mixture in the infinite depth limit. Finally, our results provide valuable guidance on prior design: for instance, controlling the predictive variance with depth- and width-informed priors on the weights of the network.
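The paper's characterization is analytical (via the Meijer-G function), but the object it describes is easy to probe empirically. The sketch below is not the authors' method; it is a plain Monte Carlo illustration, under assumed He-style weight variance 2/fan_in, of the prior predictive of a finite-width deep ReLU network at a fixed input. The excess kurtosis of the samples gives a rough view of the heavier-than-Gaussian tails discussed in the abstract; the function name and hyperparameters are illustrative choices, not taken from the paper.

```python
import numpy as np

def sample_prior_predictive(x, depth, width, n_samples=10000, seed=0):
    """Monte Carlo samples of the scalar output f(x) of a deep ReLU
    network with i.i.d. Gaussian weights (He-style variance 2/fan_in).
    Illustrative sketch only; the paper derives this distribution
    analytically rather than by sampling."""
    rng = np.random.default_rng(seed)
    fan_in0 = x.shape[0]
    outs = np.empty(n_samples)
    for s in range(n_samples):
        h, fan_in = x, fan_in0
        for _ in range(depth):
            W = rng.normal(0.0, np.sqrt(2.0 / fan_in), size=(width, fan_in))
            h = np.maximum(W @ h, 0.0)  # ReLU activation
            fan_in = width
        # Linear readout with variance 1/fan_in
        w_out = rng.normal(0.0, np.sqrt(1.0 / fan_in), size=fan_in)
        outs[s] = w_out @ h
    return outs

x = np.ones(5) / np.sqrt(5.0)  # unit-norm input
samples = sample_prior_predictive(x, depth=5, width=32)
# Positive excess kurtosis signals heavier-than-Gaussian tails,
# the finite-width effect the paper quantifies exactly.
excess_kurtosis = np.mean((samples - samples.mean()) ** 4) / samples.var() ** 2 - 3
```

Varying `depth` and `width` in this sketch shows the qualitative trend the paper makes precise: deeper, narrower networks produce a more heavy-tailed prior predictive, converging toward the normal log-normal mixture in the infinite-depth limit.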

Author Information

Lorenzo Noci (Swiss Federal Institute of Technology)
Gregor Bachmann (ETH Zürich)
Kevin Roth (ETH Zurich)
Sebastian Nowozin (Microsoft Research Cambridge)
Thomas Hofmann (ETH Zurich)
