Convex optimization based on global lower second-order models
Nikita Doikov, Yurii Nesterov
Oral presentation: Orals & Spotlights Track 21: Optimization
on Wed, Dec 9th, 2020 @ 14:45 – 15:00 GMT
Poster Session 4
on Wed, Dec 9th, 2020 @ 17:00 – 19:00 GMT
GatherTown: Optimization ( Town E3 - Spot A0 )
Abstract: In this work, we present new second-order algorithms for composite convex optimization, called Contracting-domain Newton methods. These algorithms are affine-invariant and based on a global second-order lower approximation of the smooth component of the objective. Our approach can be interpreted both as a second-order generalization of the conditional gradient method and as a variant of the trust-region scheme. Under the assumption that the problem domain is bounded, we prove an $O(1/k^2)$ global rate of convergence in functional residual, where $k$ is the iteration counter, for minimizing convex functions with Lipschitz continuous Hessian. This significantly improves the previously known bound $O(1/k)$ for this type of algorithm. Additionally, we propose a stochastic extension of our method and present computational results for solving an empirical risk minimization problem.
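A minimal one-dimensional sketch of the contracting-domain idea described in the abstract: the bounded domain (here an assumed interval $[a, b]$) is contracted toward the current point by a coefficient $\gamma_k$, and a convex quadratic model of the objective is minimized over the contracted set. The objective, the interval, and the choice $\gamma_k = 3/(k+3)$ are illustrative assumptions, not details taken from the paper.

```python
import math

# Toy 1-D sketch of a contracting-domain Newton iteration (assumptions:
# interval domain [a, b], contraction coefficient gamma_k = 3/(k + 3);
# the paper treats general composite convex problems).

def f(x):        # smooth convex objective; minimizer on [a, b] is ln(2)
    return math.exp(x) - 2.0 * x

def f1(x):       # first derivative
    return math.exp(x) - 2.0

def f2(x):       # second derivative (strictly positive)
    return math.exp(x)

a, b = -1.0, 3.0
x = 0.0
for k in range(50):
    gamma = 3.0 / (k + 3)
    # Minimize the quadratic model of the contracted step
    #   Q(v) = f(x) + f'(x)*gamma*(v - x) + 0.5*f''(x)*gamma**2*(v - x)**2
    # over v in [a, b]: take the unconstrained minimizer, then clip to the
    # domain (valid because Q is a convex quadratic in v).
    v = x - f1(x) / (gamma * f2(x))
    v = min(max(v, a), b)
    x = x + gamma * (v - x)  # contracted step toward the model minimizer

print(x)  # approaches ln(2), the minimizer of f on [a, b]
```

When the clipped point lies in the interior of the domain, this update reduces to a plain Newton step; the contraction only binds near the boundary, which is where the bounded-domain assumption of the analysis matters.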