Communication-Efficient Distributed Dual Coordinate Ascent
Martin Jaggi · Virginia Smith · Martin Takáč · Jonathan Terhorst · Sanjay Krishnan · Thomas Hofmann · Michael Jordan

Thu Dec 11 11:00 AM -- 03:00 PM (PST) @ Level 2, room 210D

Communication remains the most significant bottleneck in the performance of distributed optimization algorithms for large-scale machine learning. In this paper, we propose a communication-efficient framework, COCOA, that uses local computation in a primal-dual setting to dramatically reduce the amount of necessary communication. We provide a strong convergence rate analysis for this class of algorithms, as well as experiments on real-world distributed datasets with implementations in Spark. In our experiments, we find that, compared to state-of-the-art mini-batch versions of the SGD and SDCA algorithms, COCOA converges to the same .001-accurate solution quality on average 25× as quickly.
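The core idea sketched in the abstract — each machine runs many cheap dual coordinate ascent steps on its own data partition, and only the aggregated primal update is communicated once per round — can be illustrated with a minimal single-process simulation. The sketch below is not the paper's implementation: it applies the scheme to ridge regression (squared-loss SDCA subproblems), simulates the K machines with a loop, and uses the safe averaging variant (updates scaled by 1/K); the function name `cocoa_ridge` and all parameter choices are hypothetical.

```python
import numpy as np

def cocoa_ridge(X, y, lam=0.1, K=4, local_iters=100, outer_rounds=200, seed=0):
    """Simulated communication-efficient dual coordinate ascent (sketch).

    Solves min_w (1/2n)||Xw - y||^2 + (lam/2)||w||^2 via its dual.
    K simulated "machines" each run SDCA steps on their own coordinate
    block against a stale copy of w; once per communication round, the
    local dual and primal updates are averaged (scaled by 1/K) into the
    global state.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    alpha = np.zeros(n)          # dual variables, one per data point
    w = np.zeros(d)              # primal vector, w = X.T @ alpha / (lam * n)
    parts = np.array_split(rng.permutation(n), K)  # data partitioning

    for _ in range(outer_rounds):
        deltas = []
        for block in parts:      # in a real cluster these run in parallel
            w_local = w.copy()
            d_alpha = np.zeros(n)
            for _ in range(local_iters):
                i = rng.choice(block)
                xi = X[i]
                # closed-form SDCA coordinate step for squared loss
                resid = y[i] - xi @ w_local - (alpha[i] + d_alpha[i])
                step = resid / (1.0 + xi @ xi / (lam * n))
                d_alpha[i] += step
                w_local += step * xi / (lam * n)
            deltas.append((d_alpha, w_local - w))
        # one communication round: average the local updates
        for da, dw in deltas:
            alpha += da / K
            w += dw / K
    return w
```

Note the communication pattern: only one d-dimensional primal update per machine crosses the network each outer round, regardless of how many local coordinate steps were taken — this is the source of the communication savings relative to mini-batch methods, which must synchronize after every (small) batch.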

Author Information

Martin Jaggi (EPFL)
Virginia Smith (UC Berkeley)
Martin Takáč (Mohamed bin Zayed University of Artificial Intelligence (MBZUAI))
Jonathan Terhorst (UC Berkeley)
Sanjay Krishnan (UC Berkeley)
Thomas Hofmann (ETH Zurich)
Michael Jordan (UC Berkeley)