Manifolds, sparsity, and structured models: When can low-dimensional geometry really help?
Richard Baraniuk · Volkan Cevher · Mark A Davenport · Piotr Indyk · Bruno Olshausen · Michael B Wakin

Sat Dec 12th 07:30 AM -- 06:30 PM @ Hilton: Mt. Currie South

Manifolds, sparsity, and other low-dimensional geometric models have long been studied and exploited in machine learning, signal processing, and computer science. For instance, manifold models lie at the heart of a variety of nonlinear dimensionality reduction techniques. Similarly, sparsity has made an impact on the problems of compression, linear regression, subset selection, graphical model learning, and compressive sensing. Moreover, often motivated by evidence that various neural systems perform sparse coding, sparse representations have been exploited as an efficient and robust method for encoding a variety of natural signals. In all of these cases the key idea is to exploit low-dimensional models to obtain compact representations of the data. The goal of this workshop is to find commonalities and forge connections between these different fields, and to examine how we can exploit low-dimensional geometric models to help solve common problems in machine learning and beyond.
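To make the compressive sensing idea mentioned above concrete, here is a minimal sketch (not from the workshop itself): a k-sparse signal is recovered from far fewer random measurements than its ambient dimension by solving an l1-regularized least-squares problem with iterative soft-thresholding (ISTA). The dimensions, regularization weight, and iteration count are illustrative choices, not prescriptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m, k = 256, 80, 5                 # ambient dimension, measurements, sparsity

# Ground-truth k-sparse signal: k random entries are nonzero.
x = np.zeros(n)
support = rng.choice(n, size=k, replace=False)
x[support] = rng.standard_normal(k)

# Random Gaussian sensing matrix and compressive measurements y = A x.
A = rng.standard_normal((m, n)) / np.sqrt(m)
y = A @ x

# ISTA for min_z 0.5*||y - A z||^2 + lam*||z||_1 (lam is a hand-picked weight).
lam = 0.05
L = np.linalg.norm(A, 2) ** 2        # Lipschitz constant of the gradient
z = np.zeros(n)
for _ in range(500):
    grad = A.T @ (A @ z - y)         # gradient of the quadratic term
    w = z - grad / L                 # gradient step
    z = np.sign(w) * np.maximum(np.abs(w) - lam / L, 0.0)  # soft-threshold

rel_err = np.linalg.norm(z - x) / np.linalg.norm(x)
print(f"relative recovery error: {rel_err:.3f}")
```

With m = 80 measurements of a length-256 signal, the l1 penalty drives most coefficients exactly to zero, so the 5-sparse signal is recovered up to the small shrinkage bias introduced by the regularizer.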

Author Information

Richard Baraniuk (Rice University)
Volkan Cevher (EPFL)
Mark A Davenport (Rice University)
Piotr Indyk (Massachusetts Institute of Technology)
Bruno Olshausen (Redwood Center/UC Berkeley)
Michael B Wakin (Colorado School of Mines)
