Poster
Adaptive Gradient Quantization for Data-Parallel SGD
Fartash Faghri · Iman Tabrizian · Ilia Markov · Dan Alistarh · Daniel Roy · Ali Ramezani-Kebrya

Wed Dec 09 09:00 AM -- 11:00 AM (PST) @ Poster Session 3 #842

Many communication-efficient variants of SGD use gradient quantization schemes. These schemes are often heuristic and fixed over the course of training. We empirically observe that the statistics of gradients of deep models change during training. Motivated by this observation, we introduce two adaptive quantization schemes, ALQ and AMQ. In both schemes, processors update their compression schemes in parallel by efficiently computing sufficient statistics of a parametric distribution. We improve validation accuracy by almost 2% on CIFAR-10 and 1% on ImageNet in challenging low-cost communication setups. Our adaptive methods are also significantly more robust to the choice of hyperparameters.
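The abstract only sketches the idea at a high level, so the Python snippet below is a minimal illustration, not the paper's ALQ/AMQ algorithms: each worker stochastically quantizes its normalized gradient coordinates onto a shared set of levels, and the levels are periodically re-fit from recently observed coordinate statistics. The quantile-based level update (update_levels) is an illustrative assumption standing in for the paper's parametric-distribution fit.

import numpy as np

def stochastic_quantize(grad, levels):
    """Quantize |grad| / ||grad|| stochastically onto sorted levels in [0, 1].

    A worker would communicate (norm, signs, level indices) instead of the
    full-precision gradient.
    """
    norm = np.linalg.norm(grad)
    if norm == 0:
        return norm, np.sign(grad), np.zeros(grad.shape, dtype=np.int64)
    r = np.abs(grad) / norm                          # normalized magnitudes in [0, 1]
    idx = np.searchsorted(levels, r, side="right") - 1
    idx = np.clip(idx, 0, len(levels) - 2)
    lo, hi = levels[idx], levels[idx + 1]
    p_up = (r - lo) / np.maximum(hi - lo, 1e-12)     # unbiased: E[quantized] = r
    idx = idx + (np.random.rand(*r.shape) < p_up)
    return norm, np.sign(grad), idx

def dequantize(norm, signs, idx, levels):
    """Reconstruct the (unbiased) gradient estimate from the quantized message."""
    return norm * signs * levels[idx]

def update_levels(recent_magnitudes, num_levels):
    """Illustrative adaptive rule (assumption, not the paper's ALQ/AMQ update):
    place interior levels at quantiles of recently observed normalized magnitudes."""
    qs = np.linspace(0.0, 1.0, num_levels)[1:-1]
    interior = np.quantile(recent_magnitudes, qs)
    return np.concatenate(([0.0], np.sort(interior), [1.0]))

if __name__ == "__main__":
    grad = np.random.randn(10_000)
    levels = np.linspace(0.0, 1.0, 8)                # start from fixed, uniform levels
    norm, signs, idx = stochastic_quantize(grad, levels)
    approx = dequantize(norm, signs, idx, levels)
    print("mean abs error:", np.abs(grad - approx).mean())
    # Adapt the levels to the statistics actually observed during training.
    levels = update_levels(np.abs(grad) / norm, num_levels=8)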

Author Information

Fartash Faghri (University of Toronto)
Iman Tabrizian (University of Toronto)
Ilia Markov (IST Austria)
Dan Alistarh (IST Austria & Neural Magic Inc.)
Daniel Roy (University of Toronto & Vector Institute)
Ali Ramezani-Kebrya (University of Toronto and Vector Institute)
