The stochastic block model (SBM) has recently garnered significant attention due to new threshold phenomena. However, most developments rely on knowledge of the model parameters, or at least of the number of communities. This paper introduces efficient algorithms that do not require such knowledge and yet achieve the optimal information-theoretic tradeoffs identified in Abbe-Sandon '15. In the constant-degree regime, an algorithm is developed that requires only a lower bound on the relative sizes of the communities and achieves the optimal accuracy scaling for large degrees. This lower-bound requirement is removed in the regime of diverging degrees. For the logarithmic-degree regime, the algorithm is further enhanced into a fully agnostic one that simultaneously learns the model parameters, achieves the optimal CH-limit for exact recovery, and runs in quasi-linear time. These are the first algorithms affording efficiency, universality, and information-theoretic optimality for both strong and weak consistency in the SBM.
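For context, the CH-limit referenced above is the exact-recovery threshold of Abbe-Sandon '15: for SBM(n, p, log(n) Q / n) with community prior p, P = diag(p), and connectivity matrix Q, exact recovery is solvable if and only if I_+ = min over i != j of D_+((PQ)_i, (PQ)_j) >= 1, where D_+(mu, nu) = max over t in [0,1] of sum_k (t mu_k + (1-t) nu_k - mu_k^t nu_k^(1-t)) is the Chernoff-Hellinger divergence. The minimal sketch below evaluates this criterion numerically for a symmetric two-community instance, where it should reduce to (sqrt(a) - sqrt(b))^2 >= 2. The helper names and the grid search over t are our own illustration, not the paper's algorithm.

import numpy as np

def ch_divergence(mu, nu, grid=1001):
    # Chernoff-Hellinger divergence D_+(mu, nu) via a grid search over t in [0, 1];
    # assumes strictly positive entries, as in the nondegenerate SBM setting.
    ts = np.linspace(0.0, 1.0, grid)
    mu, nu = np.asarray(mu, float), np.asarray(nu, float)
    return max(np.sum(t * mu + (1 - t) * nu - mu**t * nu**(1 - t)) for t in ts)

def I_plus(p, Q):
    # I_+ = min over ordered pairs i != j of D_+((PQ)_i, (PQ)_j), with P = diag(p).
    PQ = np.diag(p) @ Q
    k = len(p)
    return min(ch_divergence(PQ[i], PQ[j])
               for i in range(k) for j in range(k) if i != j)

# Symmetric two-community check: exact recovery iff (sqrt(a) - sqrt(b))^2 >= 2.
a, b = 9.0, 1.0
p = np.array([0.5, 0.5])
Q = np.array([[a, b], [b, a]])
val = I_plus(p, Q)
print(val, val >= 1.0)                     # 2.0 True
print((np.sqrt(a) - np.sqrt(b))**2 / 2)    # matches: 2.0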
Author Information
Emmanuel Abbe (Princeton University)
Colin Sandon (Princeton University)
More from the Same Authors
- 2020 Poster: On the universality of deep learning (Emmanuel Abbe · Colin Sandon)
- 2018 Poster: Chaining Mutual Information and Tightening Generalization Bounds (Amir Asadi · Emmanuel Abbe · Sergio Verdu)
- 2017 Poster: Nonbacktracking Bounds on the Influence in Independent Cascade Models (Emmanuel Abbe · Sanjeev Kulkarni · Eun Jee Lee)
- 2016 Poster: Achieving the KS threshold in the general stochastic block model with linearized acyclic belief propagation (Emmanuel Abbe · Colin Sandon)
- 2016 Oral: Achieving the KS threshold in the general stochastic block model with linearized acyclic belief propagation (Emmanuel Abbe · Colin Sandon)