

Poster

Cone-Constrained Principal Component Analysis

Yash Deshpande · Andrea Montanari · Emile Richard

Level 2, room 210D

Abstract: Estimating a vector from noisy quadratic observations is a task that arises naturally in many contexts, from dimensionality reduction to synchronization and phase retrieval problems. It is often the case that additional information is available about the unknown vector (for instance, sparsity, or the sign or magnitude of its entries). Many authors have proposed non-convex quadratic optimization problems that aim to exploit this information optimally. However, solving these problems is typically NP-hard. We consider a simple model for noisy quadratic observation of an unknown vector $v_0$. The unknown vector is constrained to belong to a cone $\mathcal{C} \ni v_0$. While optimal estimation appears to be intractable for the general problems in this class, we provide evidence that it is tractable when $\mathcal{C}$ is a convex cone with an efficient projection. This is surprising, since the corresponding optimization problem is non-convex and, from a worst-case perspective, often NP-hard. We characterize the resulting minimax risk in terms of the statistical dimension of the cone, $\delta(\mathcal{C})$. This quantity is already known to control the risk of estimation from Gaussian observations and random linear measurements. It is rather surprising that the same quantity plays a role in the estimation risk from quadratic measurements.
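The abstract does not spell out an algorithm, but a natural tractable estimator in this setting is a power iteration interleaved with projection onto the cone. Below is a minimal sketch in Python, assuming a symmetric spiked model $X = \lambda\, v_0 v_0^{\top} + \text{noise}$ and taking the non-negative orthant as the cone; the function names, noise scaling, and iteration count are illustrative choices, not taken from the paper.

```python
import numpy as np

def cone_projected_power_iteration(X, project, iters=100, seed=0):
    """Estimate the spike v0 from X = lam * v0 v0^T + noise by power
    iteration, projecting onto the cone (and unit sphere) each step."""
    rng = np.random.default_rng(seed)
    v = project(rng.standard_normal(X.shape[0]))
    for _ in range(iters):
        v = project(X @ v)
    return v

def project_nonneg(u):
    """Euclidean projection onto the non-negative orthant, then renormalize."""
    w = np.clip(u, 0.0, None)
    nrm = np.linalg.norm(w)
    # fall back to a uniform direction if the projection collapses to zero
    return w / nrm if nrm > 0 else np.ones_like(w) / np.sqrt(w.size)

# Synthetic spiked model: X = lam * v0 v0^T + symmetric Gaussian noise
rng = np.random.default_rng(1)
n, lam = 500, 3.0
v0 = project_nonneg(rng.standard_normal(n))
Z = rng.standard_normal((n, n))
X = lam * np.outer(v0, v0) + (Z + Z.T) / np.sqrt(2 * n)

v_hat = cone_projected_power_iteration(X, project_nonneg)
print("correlation with truth:", abs(v_hat @ v0))
```

Each iteration costs one matrix-vector product plus one cone projection, so the estimator is efficient whenever projection onto $\mathcal{C}$ is; swapping in a different `project` function adapts the sketch to other cones.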
