Motivated by distributed machine learning settings such as Federated Learning, we consider the problem of fitting a statistical model across a distributed collection of heterogeneous data sets whose similarity structure is encoded by a graph topology. Precisely, we analyse the case where each node is associated with fitting a sparse linear model, and edges join two nodes if the difference of their solutions is also sparse. We propose a method based on Basis Pursuit Denoising with a total variation penalty, and provide finite sample guarantees for sub-Gaussian design matrices. Taking the root of the tree as a reference node, we show that if the sparsity of the differences across nodes is smaller than the sparsity at the root, then recovery is successful with fewer samples than by solving the problems independently, or by using methods that rely on a large overlap in the signal supports, such as the group Lasso. We consider both the noiseless and noisy settings, and numerically investigate the performance of distributed methods based on the distributed Alternating Direction Method of Multipliers (ADMM) and hyperspectral unmixing.
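To illustrate the noiseless version of the idea, here is a minimal, hypothetical sketch (not the authors' implementation): two nodes on a path graph, where the child's signal differs from the root's in a single coordinate. Joint recovery minimises the l1-norm at the root plus a total-variation (fused l1) penalty on the edge difference, subject to the linear measurements at each node. The problem is cast as a linear program and solved with `scipy.optimize.linprog`; all dimensions, sparsity levels, and variable names below are illustrative assumptions.

```python
# Sketch of noiseless Basis Pursuit with a total-variation penalty on a
# two-node path graph, cast as a linear program (assumed toy example).
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
d, n1, n2 = 20, 15, 12  # dimension, samples at root, samples at child

# Root signal x1 is sparse; child signal x2 differs from x1 in one coordinate.
x1 = np.zeros(d); x1[[3, 7]] = [1.0, -2.0]
x2 = x1.copy();   x2[11] = 0.5

A1 = rng.standard_normal((n1, d)); y1 = A1 @ x1
A2 = rng.standard_normal((n2, d)); y2 = A2 @ x2

# LP variables z = [x1, x2, t0, t1]:
#   minimise 1't0 + 1't1  (i.e. ||x1||_1 + ||x1 - x2||_1)
#   s.t.  A1 x1 = y1,  A2 x2 = y2,
#         |x1| <= t0 and |x1 - x2| <= t1 elementwise.
I, Z = np.eye(d), np.zeros((d, d))
c = np.concatenate([np.zeros(2 * d), np.ones(2 * d)])
A_ub = np.block([
    [ I,  Z, -I,  Z],   #  x1 - t0 <= 0
    [-I,  Z, -I,  Z],   # -x1 - t0 <= 0
    [ I, -I,  Z, -I],   #  (x1 - x2) - t1 <= 0
    [-I,  I,  Z, -I],   # -(x1 - x2) - t1 <= 0
])
b_ub = np.zeros(4 * d)
A_eq = np.block([
    [A1, np.zeros((n1, 3 * d))],
    [np.zeros((n2, d)), A2, np.zeros((n2, 2 * d))],
])
b_eq = np.concatenate([y1, y2])

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
              bounds=(None, None), method="highs")
x1_hat, x2_hat = res.x[:d], res.x[d:2 * d]
```

Because the edge difference has only one nonzero, the child needs fewer measurements than it would to recover `x2` on its own, which is the qualitative behaviour the finite-sample guarantees formalise.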
Author Information
Dominic Richards (University of Oxford)
Sahand Negahban (Yale University)
Patrick Rebeschini (University of Oxford)
More from the Same Authors
- 2021: Exploiting 3D Shape Bias towards Robust Vision (Yutaro Yamada · Yuval Kluger · Sahand Negahban · Ilker Yildirim)
- 2021 Poster: Implicit Regularization in Matrix Sensing via Mirror Descent (Fan Wu · Patrick Rebeschini)
- 2021 Poster: Time-independent Generalization Bounds for SGLD in Non-convex Settings (Tyler Farghly · Patrick Rebeschini)
- 2021 Poster: Stability & Generalisation of Gradient Descent for Shallow Neural Networks without the Neural Tangent Kernel (Dominic Richards · Ilja Kuzborskij)
- 2021 Poster: On Optimal Interpolation in Linear Regression (Eduard Oravkin · Patrick Rebeschini)
- 2020 Poster: A Continuous-Time Mirror Descent Approach to Sparse Phase Retrieval (Fan Wu · Patrick Rebeschini)
- 2020 Spotlight: A Continuous-Time Mirror Descent Approach to Sparse Phase Retrieval (Fan Wu · Patrick Rebeschini)
- 2020 Poster: The Statistical Complexity of Early-Stopped Mirror Descent (Tomas Vaskevicius · Varun Kanade · Patrick Rebeschini)
- 2020 Spotlight: The Statistical Complexity of Early-Stopped Mirror Descent (Tomas Vaskevicius · Varun Kanade · Patrick Rebeschini)
- 2019 Poster: Implicit Regularization for Optimal Sparse Recovery (Tomas Vaskevicius · Varun Kanade · Patrick Rebeschini)
- 2019 Poster: Optimal Statistical Rates for Decentralised Non-Parametric Regression with Linear Speed-Up (Dominic Richards · Patrick Rebeschini)
- 2019 Poster: Decentralized Cooperative Stochastic Bandits (David Martínez-Rubio · Varun Kanade · Patrick Rebeschini)
- 2017 Poster: Minimax Estimation of Bandable Precision Matrices (Addison Hu · Sahand Negahban)
- 2017 Poster: Accelerated consensus via Min-Sum Splitting (Patrick Rebeschini · Sekhar C Tatikonda)