Spotlight
Straggler Mitigation in Distributed Optimization Through Data Encoding
Can Karakus · Yifan Sun · Suhas Diggavi · Wotao Yin

Wed Dec 06 03:45 PM -- 03:50 PM (PST) @ Hall C

Slow-running, or straggler, tasks can significantly reduce computation speed in distributed computation. Recently, coding-theory-inspired approaches have been applied to mitigate the effect of stragglers by embedding redundancy in certain linear computational steps of the optimization algorithm, thus completing the computation without waiting for the stragglers. In this paper, we propose an alternative approach in which we embed the redundancy in the data instead of the computation, and allow the nodes to operate completely oblivious to the encoding. We propose several encoding schemes, and demonstrate that popular batch algorithms, such as gradient descent and L-BFGS, applied in a coding-oblivious manner, deterministically achieve sample path linear convergence to an approximate solution of the original problem, using an arbitrarily varying subset of the nodes at each iteration. Moreover, this approximation can be controlled by the choice of the encoding matrix and the number of nodes used in each iteration. We provide experimental results demonstrating the advantage of the approach over uncoded and replication strategies.
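To make the idea concrete, below is a minimal, hypothetical sketch (not the authors' implementation) of the data-encoding approach on a toy least-squares problem: the data (A, b) is premultiplied by a redundant encoding matrix S before being partitioned across workers, and gradient descent then uses only a random subset of (non-straggler) workers at each iteration, with each worker oblivious to the encoding. The random encoding matrix, redundancy factor, step size, and node counts are illustrative assumptions, not choices from the paper.

```python
import numpy as np

# Toy problem: min_x ||A x - b||^2.
rng = np.random.default_rng(0)
n, d, n_nodes = 200, 10, 8
A = rng.standard_normal((n, d))
b = A @ rng.standard_normal(d) + 0.1 * rng.standard_normal(n)

# Encode the data with redundancy factor beta > 1 (here a random S;
# the paper studies structured encoding matrices that control the error).
beta = 2.0
m = int(beta * n)
S = rng.standard_normal((m, n)) / np.sqrt(m)
A_enc, b_enc = S @ A, S @ b

# Partition the encoded rows across worker nodes.
row_parts = np.array_split(np.arange(m), n_nodes)

x = np.zeros(d)
step = 0.05
for it in range(300):
    # Each iteration waits only for a random subset of nodes (non-stragglers).
    alive = rng.choice(n_nodes, size=n_nodes // 2, replace=False)
    grad = np.zeros(d)
    for k in alive:
        idx = row_parts[k]
        # Each worker is oblivious to the encoding: it simply computes the
        # gradient of its local encoded partition.
        grad += A_enc[idx].T @ (A_enc[idx] @ x - b_enc[idx])
    x -= step * grad / len(alive)

# Compare against the uncoded least-squares solution.
x_ls = np.linalg.lstsq(A, b, rcond=None)[0]
print("distance to uncoded solution:", np.linalg.norm(x - x_ls))
```

Increasing the redundancy factor or the number of nodes waited for at each iteration tightens the approximation, reflecting the trade-off the abstract describes.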

Author Information

Can Karakus (UCLA)
Yifan Sun
Suhas Diggavi (UCLA)
Wotao Yin (University of California, Los Angeles)
