Many high-dimensional statistical inference problems are believed to possess inherent computational hardness. Various frameworks have been proposed to give rigorous evidence for such hardness, including lower bounds against restricted models of computation (such as low-degree functions), as well as methods rooted in statistical physics that are based on free energy landscapes. This paper aims to make a rigorous connection between the seemingly different low-degree and free-energy based approaches. We define a free-energy based criterion for hardness and formally connect it to the well-established notion of low-degree hardness for a broad class of statistical problems, namely all Gaussian additive models and certain models with a sparse planted signal. By leveraging these rigorous connections we are able to: establish that for Gaussian additive models the "algebraic" notion of low-degree hardness implies failure of "geometric" local MCMC algorithms, and provide new low-degree lower bounds for sparse linear regression which seem difficult to prove directly. These results provide both conceptual insights into the connections between different notions of hardness, as well as concrete technical tools such as new methods for proving low-degree lower bounds.
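As a brief sketch of the setting the abstract refers to (these definitions are standard in the low-degree literature, not quoted from this paper): in a Gaussian additive model one observes a planted signal corrupted by Gaussian noise, and low-degree hardness is measured through the low-degree projection of the likelihood ratio.

```latex
% Gaussian additive model: a planted signal X, drawn from a prior \mathcal{P},
% observed through Gaussian noise at signal-to-noise ratio \lambda.
\[
Y = \lambda X + Z, \qquad X \sim \mathcal{P}, \quad Z \sim \mathcal{N}(0, I_n).
\]
% Low-degree hardness is quantified by the norm of the degree-\le D projection
% of the likelihood ratio L = dP/dQ between the planted and null distributions:
\[
\mathrm{Adv}_{\le D} \;=\; \lVert L^{\le D} \rVert
\;=\; \max_{\deg f \le D}
\frac{\mathbb{E}_{P}[f(Y)]}{\sqrt{\mathbb{E}_{Q}[f(Y)^{2}]}},
\]
% which, for the Gaussian additive model, admits the explicit expansion
\[
\lVert L^{\le D} \rVert^{2} \;=\; \sum_{d=0}^{D} \frac{\lambda^{2d}}{d!}\,
\mathbb{E}\big[\langle X, X' \rangle^{d}\big],
\]
% where X, X' are independent draws from \mathcal{P}. "Low-degree hardness"
% means this quantity remains bounded as the dimension n grows.
```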
Author Information
Afonso S Bandeira (ETH Zurich)
Ahmed El Alaoui (Stanford University)
Samuel Hopkins (Massachusetts Institute of Technology)
Tselil Schramm (Stanford University)
Alexander S Wein (University of California, Davis)
Ilias Zadik (MIT)
I am a CDS Moore-Sloan (postdoctoral) fellow at the Center for Data Science of NYU and a member of its Math and Data (MaD) group. I received my PhD in September 2019 from MIT, where I was advised by David Gamarnik. My research lies broadly at the interface of high-dimensional statistics, the theory of machine learning, and applied probability.
More from the Same Authors
- 2022 Panel: Panel 3C-1: The Franz-Parisi Criterion… & List-Decodable Sparse Mean…
  Afonso S Bandeira · Sushrut Karmalkar
- 2022 Poster: Archimedes Meets Privacy: On Privately Estimating Quantiles in High Dimensions Under Minimal Assumptions
  Omri Ben-Eliezer · Dan Mikulincer · Ilias Zadik
- 2022 Poster: Privacy Induces Robustness: Information-Computation Gaps and Sparse Mean Estimation
  Kristian Georgiev · Samuel Hopkins
- 2021 Poster: On the Cryptographic Hardness of Learning Single Periodic Neurons
  Min Jae Song · Ilias Zadik · Joan Bruna
- 2021 Poster: Robust Regression Revisited: Acceleration and Improved Estimation Rates
  Arun Jambulapati · Jerry Li · Tselil Schramm · Kevin Tian
- 2018 Poster: High Dimensional Linear Regression using Lattice Basis Reduction
  Ilias Zadik · David Gamarnik
- 2017: Poster session
  Abbas Zaidi · Christoph Kurz · David Heckerman · YiJyun Lin · Stefan Riezler · Ilya Shpitser · Songbai Yan · Olivier Goudet · Yash Deshpande · Judea Pearl · Jovana Mitrovic · Brian Vegetabile · Tae Hwy Lee · Karen Sachs · Karthika Mohan · Reagan Rose · Julius Ramakers · Negar Hassanpour · Pierre Baldi · Razieh Nabi · Noah Hammarlund · Eli Sherman · Carolin Lawrence · Fattaneh Jabbari · Vira Semenova · Maria Dimakopoulou · Pratik Gajane · Russell Greiner · Ilias Zadik · Alexander Blocker · Hao Xu · Tal EL HAY · Tony Jebara · Benoit Rostykus