Poster
High-Dimensional Optimization in Adaptive Random Subspaces
Jonathan Lacotte · Mert Pilanci · Marco Pavone

Wed Dec 11th 05:00 -- 07:00 PM @ East Exhibition Hall B + C #163

We propose a new randomized optimization method for high-dimensional problems, which can be seen as a generalization of coordinate descent to random subspaces. We show that an adaptive sampling strategy for the random subspace significantly outperforms the oblivious sampling method, which is the common choice in the recent literature. The adaptive subspace can be efficiently generated by a correlated random matrix ensemble whose statistics mimic the input data. We prove that the improvement in the relative error of the solution can be tightly characterized in terms of the spectrum of the data matrix, and provide probabilistic upper bounds. We then illustrate the consequences of our theory with data matrices of different spectral decay. Extensive experimental results show that the proposed approach offers significant speed-ups in machine learning problems including logistic regression, kernel classification with random convolution layers, and shallow neural networks with rectified linear units. Our analysis is based on convex analysis and Fenchel duality, and establishes connections to sketching and randomized matrix decompositions.
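A minimal sketch of the idea on a toy least-squares problem, not the authors' exact algorithm: the solution is restricted to a random subspace spanned by a sketching matrix, comparing an oblivious i.i.d. Gaussian sketch against one plausible adaptive instantiation (a Gaussian matrix multiplied by the data, so its statistics mimic the data's spectrum). All names here (`solve_in_subspace`, the specific ensembles, the spectral-decay choice) are our assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, m = 200, 100, 20  # samples, ambient dimension, subspace dimension

# Synthetic data matrix with geometrically decaying spectrum (illustrative).
U, _ = np.linalg.qr(rng.standard_normal((n, d)))
V, _ = np.linalg.qr(rng.standard_normal((d, d)))
spectrum = 0.8 ** np.arange(d)
A = U @ np.diag(spectrum) @ V.T
b = rng.standard_normal(n)

# Full least-squares objective and its unconstrained optimum.
f = lambda x: 0.5 * np.sum((A @ x - b) ** 2)
x_star, *_ = np.linalg.lstsq(A, b, rcond=None)
f_star = f(x_star)

def solve_in_subspace(S):
    """Restrict x to range(S.T): solve min_z f(S.T @ z), return x = S.T @ z."""
    z, *_ = np.linalg.lstsq(A @ S.T, b, rcond=None)
    return S.T @ z

# Oblivious sketch: i.i.d. Gaussian rows, independent of the data.
S_obl = rng.standard_normal((m, d)) / np.sqrt(m)
# Adaptive sketch (hypothetical instantiation): Gaussian times A, a
# correlated ensemble whose statistics mimic the data matrix.
S_ada = (rng.standard_normal((m, n)) / np.sqrt(m)) @ A

gap = lambda x: (f(x) - f_star) / f_star  # relative suboptimality
print("oblivious gap:", gap(solve_in_subspace(S_obl)))
print("adaptive  gap:", gap(solve_in_subspace(S_ada)))
```

In runs with strongly decaying spectra the adaptive sketch typically attains a smaller suboptimality gap, consistent with the spectral characterization claimed in the abstract.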

Author Information

Jonathan Lacotte (Stanford University)
Mert Pilanci (Stanford University)
Marco Pavone (Stanford University)
