

Poster

Taming the Wild: A Unified Analysis of Hogwild-Style Algorithms

Christopher M De Sa · Ce Zhang · Kunle Olukotun · Christopher Ré

210 C #85

Abstract:

Stochastic gradient descent (SGD) is a ubiquitous algorithm for a variety of machine learning problems. Researchers and industry have developed several techniques to optimize SGD's runtime performance, including asynchronous execution and reduced precision. Our main result is a martingale-based analysis that enables us to capture the rich noise models that may arise from such techniques. Specifically, we use our new analysis in three ways: (1) we derive convergence rates for the convex case (Hogwild!) with relaxed assumptions on the sparsity of the problem; (2) we analyze asynchronous SGD algorithms for non-convex matrix problems including matrix completion; and (3) we design and analyze an asynchronous SGD algorithm, called Buckwild!, that uses lower-precision arithmetic. We show experimentally that our algorithms run efficiently for a variety of problems on modern hardware.
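The two techniques the abstract highlights can be summarized in a short sketch. The Python snippet below is a minimal illustration of a Hogwild-style lock-free SGD loop, not the authors' implementation; the objective (sparse logistic regression) and all names (`hogwild_sgd`, `stoch_round`, `n_workers`, `step`) are assumptions for illustration. Note that CPython's GIL serializes most of the work, so the snippet demonstrates the lock-free access pattern rather than real parallel speedup.

```python
import threading
import numpy as np

def hogwild_sgd(X, y, n_workers=4, epochs=5, step=0.1):
    """Hogwild-style SGD sketch: workers update a shared weight vector
    without locks; with sparse examples, write collisions are rare."""
    n, d = X.shape
    w = np.zeros(d)  # shared parameters, updated lock-free by all workers

    def worker(indices):
        for i in indices:
            xi, yi = X[i], y[i]
            # logistic-loss gradient on one example
            g = (1.0 / (1.0 + np.exp(-xi @ w)) - yi) * xi
            nz = np.nonzero(xi)[0]
            w[nz] -= step * g[nz]  # racy update on the nonzero coordinates only

    for _ in range(epochs):
        chunks = np.array_split(np.random.permutation(n), n_workers)
        threads = [threading.Thread(target=worker, args=(c,)) for c in chunks]
        for t in threads:
            t.start()
        for t in threads:
            t.join()
    return w

def stoch_round(v, scale=127.0):
    """Unbiased ('stochastic') rounding to a coarse grid, the kind of
    lower-precision update a Buckwild-style algorithm relies on:
    E[stoch_round(v)] == v, so the quantization noise is zero-mean."""
    s = np.asarray(v) * scale
    lo = np.floor(s)
    return (lo + (np.random.rand(*s.shape) < (s - lo))) / scale
```

Whether Buckwild! uses exactly this rounding scheme at this precision is an assumption here; the point of the sketch is that unbiased rounding keeps the quantization noise zero-mean, which is exactly the kind of noise model the martingale-based analysis is built to handle.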
