Abstract:
We present a federated, asynchronous, and $(\varepsilon, \delta)$-differentially private algorithm for PCA in the memory-limited setting.
%
Our algorithm incrementally computes local model updates using a streaming procedure and adaptively estimates its $r$ leading principal components when only $\mathcal{O}(dr)$ memory is available, with $d$ being the dimensionality of the data.
%
We guarantee differential privacy via an input-perturbation scheme in which the covariance matrix of a dataset $\mathbf{B} \in \mathbb{R}^{d \times n}$ is perturbed with a non-symmetric random Gaussian matrix with variance in $\mathcal{O}\!\left(\left(\tfrac{d}{n}\right)^2 \log d\right)$, thus improving upon the state-of-the-art.
%
Furthermore, contrary to previous federated or distributed algorithms for PCA, our algorithm is also invariant to permutations in the incoming data, which provides robustness against stragglers and node failures.
%
Numerical simulations show that, while using limited memory, our algorithm achieves performance that closely matches or exceeds that of traditional non-federated algorithms and, in the absence of communication latency, exhibits attractive horizontal scalability.
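As a minimal sketch of the input-perturbation idea described above: the empirical covariance is perturbed with a non-symmetric Gaussian noise matrix before any downstream computation, so the principal components are estimated from the privatized covariance only. All names here, and the noise scale \texttt{sigma}, are illustrative assumptions; in the paper the variance is calibrated to the privacy parameters, which this sketch does not do.

```python
import numpy as np

rng = np.random.default_rng(0)

d, n = 5, 1000    # data dimensionality and sample count (illustrative)
sigma = 0.1       # illustrative noise scale; NOT the paper's calibrated variance

B = rng.standard_normal((d, n))   # dataset with samples as columns
C = (B @ B.T) / n                 # empirical covariance matrix, d x d

# Input perturbation: add a non-symmetric random Gaussian matrix to the
# covariance; downstream computation never touches the raw data again.
N = sigma * rng.standard_normal((d, d))
C_priv = C + N

# Estimate the leading principal component from the perturbed (generally
# non-symmetric) matrix via SVD.
U, S, Vt = np.linalg.svd(C_priv)
top_pc = U[:, 0]                  # unit-norm estimate of the leading component
```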