Adaptive Averaging in Accelerated Descent Dynamics
Walid Krichene · Alexandre Bayen · Peter Bartlett

Wed Dec 07 09:00 AM -- 12:30 PM (PST) @ Area 5+6+7+8 #114
We study accelerated descent dynamics for constrained convex optimization. These dynamics can be described naturally as a coupling of a dual variable that accumulates gradients at a rate $\eta(t)$ with a primal variable obtained as the weighted average, with weights $w(t)$, of the mirrored dual trajectory. Using a Lyapunov argument, we give sufficient conditions on $\eta$ and $w$ to achieve a desired convergence rate. As an example, we show that the replicator dynamics (an instance of mirror descent on the simplex) can be accelerated using a simple averaging scheme. We then propose an adaptive averaging heuristic that adaptively computes the weights to speed up the decrease of the Lyapunov function. We provide guarantees on adaptive averaging in continuous time, prove that in discrete time it preserves the quadratic $O(1/k^2)$ convergence rate of accelerated first-order methods, and give numerical experiments comparing it with existing heuristics such as adaptive restarting. The experiments indicate that adaptive averaging performs at least as well as adaptive restarting, with significant improvements in some cases.
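To illustrate the primal–dual averaging structure the abstract describes, here is a minimal discrete-time sketch (not the authors' code): entropic mirror descent on the simplex, where a dual variable accumulates gradients at a growing rate and the primal iterate is a weighted running average of the mirrored dual trajectory. The step size `s`, the schedules $\eta_k \propto k$ and $w_k \propto k$, and the quadratic test objective are illustrative assumptions, not the paper's exact parameters.

```python
import numpy as np

def softmax(z):
    """Mirror map for the entropy on the simplex: maps a dual point in R^n
    to a probability vector (numerically stabilized)."""
    e = np.exp(z - z.max())
    return e / e.sum()

def accelerated_mirror_descent(grad, x0, steps=200, s=0.1):
    """Sketch of the coupled dynamics: the dual z accumulates gradients at a
    growing rate eta_k ~ k, and the primal x is the w_k-weighted running
    average of the mirrored dual trajectory (weights w_k ~ k, an assumption
    standing in for the paper's schedules)."""
    z = np.log(x0)          # dual init chosen so that softmax(z) == x0
    x = x0.copy()
    W = 0.0                 # running sum of the averaging weights w_k
    for k in range(steps):
        z = z - s * (k + 1) * grad(x)    # accumulate gradient at rate eta_k ~ k
        x_tilde = softmax(z)             # mirrored dual point, stays on the simplex
        w = k + 1.0                      # averaging weight w_k ~ k
        W += w
        x = x + (w / W) * (x_tilde - x)  # incremental form of the weighted average
    return x

# Illustrative test problem (an assumption): minimize f(x) = 0.5*||x - p||^2
# over the simplex; since p lies on the simplex, the minimizer is p itself.
p = np.array([0.5, 0.3, 0.2])
grad_f = lambda x: x - p
x0 = np.ones(3) / 3
x_star = accelerated_mirror_descent(grad_f, x0)
```

Note that the incremental update `x + (w/W)*(x_tilde - x)` is algebraically the normalized weighted sum $\sum_k w_k \tilde{x}_k / \sum_k w_k$; the adaptive averaging heuristic of the paper would replace the fixed schedule `w = k + 1` with weights computed from the Lyapunov function's decrease.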

Author Information

Walid Krichene (UC Berkeley)
Alexandre Bayen (UC Berkeley)
Peter Bartlett (UC Berkeley)