Communication-Efficient Topologies for Decentralized Learning with $O(1)$ Consensus Rate
Zhuoqing Song · Weijian Li · Kexin Jin · Lei Shi · Ming Yan · Wotao Yin · Kun Yuan

Wed Nov 30 02:00 PM -- 04:00 PM (PST) @ Hall J #804
Decentralized optimization is an emerging paradigm in distributed learning in which agents achieve network-wide solutions via peer-to-peer communication without a central server. Since communication tends to be slower than computation, an agent that communicates with only a few neighbors per iteration completes each iteration faster than one that communicates with many neighbors or with a central server. However, the total number of iterations needed to reach a network-wide solution depends on how quickly information is "mixed" across agents by communication. We find that popular communication topologies either have large degrees (such as stars and complete graphs) or mix information ineffectively (such as rings and grids). To address this problem, we propose a new family of topologies, EquiTopo, which has an (almost) constant degree and a network-size-independent consensus rate, the standard measure of mixing efficiency. In the proposed family, EquiStatic has a degree of $\Theta(\ln(n))$, where $n$ is the network size, while EquiDyn, a series of time-varying one-peer topologies generated through a random sampling procedure, has a constant degree of 1. Both achieve an $n$-independent consensus rate. We apply them to decentralized SGD and decentralized gradient tracking and obtain faster communication and better convergence, both theoretically and empirically. Our code is implemented with BlueFog and is available at https://github.com/kexinjinnn/EquiTopo.
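To make the one-peer idea concrete, here is a minimal NumPy sketch of time-varying one-peer gossip in the spirit of EquiDyn: at every round a shared random shift pairs each agent with a single out-neighbor, and agents average with that one peer. The shift-sampling rule and the 1/2 mixing weight are illustrative assumptions, not the paper's exact construction.

```python
import numpy as np

def one_peer_gossip_step(x, rng):
    """One gossip round on a random one-peer directed topology.

    Each agent i averages with agent (i + s) mod n for a shared random
    shift s. This is a simplified stand-in for EquiDyn's sampling
    procedure; the paper's actual construction may differ.
    """
    n = x.shape[0]
    s = rng.integers(1, n)             # common random shift, 1 <= s < n
    peer = (np.arange(n) + s) % n      # each agent's single out-neighbor
    # Mixing matrix W = 0.5 * (I + P) with P a cyclic permutation;
    # W is doubly stochastic, so the network average is preserved.
    return 0.5 * (x + x[peer])

rng = np.random.default_rng(0)
n = 64
x = rng.normal(size=(n, 1))            # one scalar value per agent
target = x.mean()                      # the network-wide average
for _ in range(30):
    x = one_peer_gossip_step(x, rng)
consensus_err = np.max(np.abs(x - target))
```

Because each round's mixing matrix is doubly stochastic, the average of the agents' values never changes, while the deviation from that average shrinks; each agent sends and receives only one message per round, which is the degree-1 property the abstract highlights.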

Author Information

Zhuoqing Song (Fudan University)
Weijian Li (Damo Academy, Alibaba Group)
Kexin Jin (Princeton University)
Lei Shi (Fudan University)
Ming Yan (The Chinese University of Hong Kong, Shenzhen)
Wotao Yin (Alibaba Group US)
Kun Yuan (Peking University)