Poster
A Continuous-Time Mirror Descent Approach to Sparse Phase Retrieval
Fan Wu · Patrick Rebeschini
We analyze continuous-time mirror descent applied to sparse phase retrieval, the problem of recovering sparse signals from a set of magnitude-only measurements. We apply mirror descent to the unconstrained empirical risk minimization problem (batch setting), using the square loss and squared measurements. We provide a full convergence analysis of the algorithm in this non-convex setting and prove that, with the hypentropy mirror map, mirror descent recovers any $k$-sparse vector $\mathbf{x}^\star\in\mathbb{R}^n$ whose minimum (in modulus) non-zero entry is on the order of $\| \mathbf{x}^\star \|_2/\sqrt{k}$ from $k^2$ Gaussian measurements, modulo logarithmic terms. This yields a simple algorithm which, unlike most existing approaches to sparse phase retrieval, adapts to the sparsity level without thresholding steps or added regularization terms. Our results also provide a principled theoretical understanding of Hadamard Wirtinger flow [54], since Euclidean gradient descent applied to the empirical risk with the Hadamard parametrization can be recovered as a first-order approximation to mirror descent in discrete time.
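The algorithm described in the abstract admits a compact implementation. The following is a minimal sketch (not the authors' exact code) of discrete-time mirror descent with the hypentropy mirror map, whose gradient acts entrywise as $\operatorname{arcsinh}(x/\beta)$; the parameter choices `beta`, `eta`, `iters`, and the near-zero initialization are illustrative assumptions, not values from the paper.

```python
import numpy as np

def loss(A, y, x):
    """Empirical risk with the square loss on squared measurements:
    f(x) = (1/4m) * sum_i ((a_i . x)^2 - y_i)^2."""
    return np.mean(((A @ x) ** 2 - y) ** 2) / 4.0

def mirror_descent(A, y, beta=1e-3, eta=0.01, iters=300):
    """Discrete-time mirror descent with the hypentropy mirror map.
    beta, eta, iters, and the initialization are illustrative choices."""
    m, n = A.shape
    x = np.full(n, beta)  # small (near-zero) initialization
    for _ in range(iters):
        Ax = A @ x
        # gradient of loss(): (1/m) * sum_i ((a_i.x)^2 - y_i) (a_i.x) a_i
        grad = (A * (((Ax ** 2) - y) * Ax)[:, None]).mean(axis=0)
        # mirror step in dual coordinates: theta <- theta - eta * grad,
        # where theta = arcsinh(x / beta)
        x = beta * np.sinh(np.arcsinh(x / beta) - eta * grad)
    return x
```

To first order in the step size, the update reads $x^+ \approx x - \eta\,\sqrt{x^2+\beta^2}\odot\nabla f(x)$, so for small $\beta$ near-zero coordinates barely move while active coordinates are updated at full scale, which is the mechanism behind the sparsity adaptation and the connection to Hadamard Wirtinger flow mentioned above.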
Author Information
Fan Wu (University of Oxford)
Patrick Rebeschini (University of Oxford)
Related Events (a corresponding poster, oral, or spotlight)
- 2020 Spotlight: A Continuous-Time Mirror Descent Approach to Sparse Phase Retrieval
  Fri. Dec 11th, 04:00 -- 04:10 AM, Room: Orals & Spotlights: Optimization
More from the Same Authors
- 2023 Poster: A Novel Framework for Policy Mirror Descent with General Parametrization and Linear Convergence
  Carlo Alfano · Rui Yuan · Patrick Rebeschini
- 2023 Poster: Optimal Convergence Rate for Exact Policy Mirror Descent in Discounted Markov Decision Processes
  Emmeran Johnson · Ciara Pike-Burke · Patrick Rebeschini
- 2021 Poster: Implicit Regularization in Matrix Sensing via Mirror Descent
  Fan Wu · Patrick Rebeschini
- 2021 Poster: Distributed Machine Learning with Sparse Heterogeneous Data
  Dominic Richards · Sahand Negahban · Patrick Rebeschini
- 2021 Poster: Time-independent Generalization Bounds for SGLD in Non-convex Settings
  Tyler Farghly · Patrick Rebeschini
- 2021 Poster: On Optimal Interpolation in Linear Regression
  Eduard Oravkin · Patrick Rebeschini
- 2020 Poster: The Statistical Complexity of Early-Stopped Mirror Descent
  Tomas Vaskevicius · Varun Kanade · Patrick Rebeschini
- 2020 Spotlight: The Statistical Complexity of Early-Stopped Mirror Descent
  Tomas Vaskevicius · Varun Kanade · Patrick Rebeschini
- 2019 Poster: Implicit Regularization for Optimal Sparse Recovery
  Tomas Vaskevicius · Varun Kanade · Patrick Rebeschini
- 2019 Poster: Optimal Statistical Rates for Decentralised Non-Parametric Regression with Linear Speed-Up
  Dominic Richards · Patrick Rebeschini
- 2019 Poster: Decentralized Cooperative Stochastic Bandits
  David Martínez-Rubio · Varun Kanade · Patrick Rebeschini
- 2017 Poster: Accelerated consensus via Min-Sum Splitting
  Patrick Rebeschini · Sekhar C Tatikonda