
Proximal Newton-type Methods for Minimizing Convex Objective Functions in Composite Form
Jason D Lee · Yuekai Sun · Michael Saunders

Thu Dec 06 02:00 PM -- 12:00 AM (PST) @ Harrah's Special Events Center 2nd Floor
We consider minimizing convex objective functions in \emph{composite form} \begin{align*} \underset{x\in\mathbb{R}^n}{\text{minimize}}\ f(x) := g(x) + h(x), \end{align*} where $g$ is convex and twice continuously differentiable and $h:\mathbb{R}^n\to\mathbb{R}$ is convex but not necessarily differentiable, with a proximal mapping that can be evaluated efficiently. We derive a generalization of Newton-type methods that handles such convex but nonsmooth objective functions. Many problems of relevance in high-dimensional statistics, machine learning, and signal processing can be formulated in composite form. We prove that these methods converge globally to a minimizer and achieve a quadratic rate of convergence in the vicinity of a unique minimizer. We also demonstrate their performance on problems from machine learning and high-dimensional statistics.
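As a minimal illustration of the composite setting (not the authors' full second-order method), consider the special case where the Hessian model in a proximal Newton-type step is the scaled identity $H_k = (1/t)I$: the scaled proximal subproblem then reduces to the classical proximal gradient step. The sketch below, a hypothetical example, applies this to the lasso, where $g(x) = \tfrac{1}{2}\|Ax - b\|^2$ and $h(x) = \lambda\|x\|_1$, whose proximal mapping is soft-thresholding.

```python
import numpy as np

def soft_threshold(v, tau):
    # Proximal mapping of tau * ||.||_1 (componentwise soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def prox_gradient_lasso(A, b, lam, t=None, n_iter=500):
    """Minimize g(x) + h(x) with g(x) = 0.5*||Ax - b||^2 and h(x) = lam*||x||_1.

    This is the special case of a proximal Newton-type iteration with the
    Hessian approximation H_k = (1/t) I, i.e. the proximal gradient method.
    """
    n = A.shape[1]
    x = np.zeros(n)
    if t is None:
        # Step size 1/L, where L = ||A||_2^2 bounds the Lipschitz constant
        # of the gradient of the smooth part g.
        t = 1.0 / np.linalg.norm(A, 2) ** 2
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)                    # gradient of g
        x = soft_threshold(x - t * grad, t * lam)   # prox step on h
    return x
```

A genuine proximal Newton-type method would instead use a curvature-aware $H_k$ (e.g., the exact Hessian or a quasi-Newton approximation) and solve the resulting scaled proximal subproblem inexactly, which is what yields the local quadratic convergence described in the abstract.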

Author Information

Jason D Lee (Stanford)
Yuekai Sun (Stanford University)
Michael Saunders (Stanford University)
