Poster

The committee machine: Computational to statistical gaps in learning a two-layers neural network

Benjamin Aubin · Antoine Maillard · Jean Barbier · Florent Krzakala · Nicolas Macris · Lenka Zdeborová

Room 517 AB #111

Keywords: [ Statistical Physics of Learning ] [ Belief Propagation ] [ Large Deviations and Asymptotic Analysis ] [ Information Theory ]


Abstract:

Heuristic tools from statistical physics have been used in the past to compute the optimal learning and generalization errors in the teacher-student scenario in multi-layer neural networks. In this contribution, we provide a rigorous justification of these approaches for a two-layer neural network model called the committee machine. We also introduce a version of the approximate message passing (AMP) algorithm for the committee machine that allows one to perform optimal learning in polynomial time for a large set of parameters. We find that there are regimes in which a low generalization error is information-theoretically achievable while the AMP algorithm fails to deliver it, strongly suggesting that no efficient algorithm exists for those cases and unveiling a large computational gap.
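For readers unfamiliar with the model, the following is a minimal sketch of the teacher-student committee machine setup that the abstract refers to, assuming sign activations and a fixed second layer (a standard choice for this model); the dimensions and variable names are illustrative, and this shows only the data-generating model, not the AMP algorithm itself.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes: input dimension n, K hidden units (odd, to avoid
# ties in the majority vote), m training samples.
n, K, m = 1000, 3, 5000

# Teacher weights: one i.i.d. standard Gaussian n-dim vector per hidden unit.
W_star = rng.standard_normal((K, n))

# Inputs: i.i.d. Gaussian data, one sample per row.
X = rng.standard_normal((m, n))

def committee_output(W, X):
    """Committee machine with sign activations and a fixed second layer:
    each hidden unit votes sign(w_k . x / sqrt(n)), and the output is
    the sign of the majority vote over the K units."""
    n = X.shape[1]
    hidden = np.sign(X @ W.T / np.sqrt(n))  # shape (m, K)
    return np.sign(hidden.sum(axis=1))

# Labels produced by the teacher; the student's task is to recover
# W_star (up to permutations of the hidden units) from (X, y).
y = committee_output(W_star, X)
```

In this teacher-student scenario the question studied in the paper is how many samples m (relative to n) are needed to reach a given generalization error, both information-theoretically and with the polynomial-time AMP algorithm.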
