Communication-efficient distributionally robust decentralized learning

Zecchin, Matteo; Kountouris, Marios; Gesbert, David
Transactions on Machine Learning Research Journal, December 2022

Decentralized learning algorithms empower interconnected edge devices to share data and computational resources in order to collaboratively train a machine learning model without the aid of a central coordinator (e.g., an orchestrating base station). When the data distributions at the network devices are heterogeneous, however, collaboration can yield predictors with unsatisfactory performance for a subset of the devices. For this reason, in this work we formulate a distributionally robust decentralized learning task and propose a single-loop decentralized gradient descent/ascent algorithm (AD-GDA) to solve the underlying minimax optimization problem. We make the algorithm communication efficient by employing a compressed consensus scheme, and we provide convergence guarantees for smooth convex and non-convex loss functions. Finally, we corroborate the theoretical findings with empirical evidence that the proposed algorithm provides unbiased predictors over a network of collaborating devices with highly heterogeneous data distributions.
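
To make the high-level description above concrete, the sketch below illustrates the kind of update such a method performs: each device takes a descent step on its local model with its loss reweighted by an adversarial distribution weight, an ascent step updates those weights toward the worst-performing devices, and a compressed gossip round mixes the local models with neighbors. This is a minimal illustrative sketch, not the paper's AD-GDA implementation: the quadratic loss, the top-k compressor, the exponentiated-gradient ascent step, the fully connected mixing matrix, the step sizes, and the fact that the distribution weights are updated with global knowledge of all local losses (a fully decentralized scheme would also track these via consensus) are all assumptions made for this example.

```python
import numpy as np

def top_k(v, k):
    """Keep the k largest-magnitude entries of v and zero the rest (a simple sparsifying compressor)."""
    out = np.zeros_like(v)
    idx = np.argsort(np.abs(v))[-k:]
    out[idx] = v[idx]
    return out

rng = np.random.default_rng(0)
n, d, m = 4, 5, 50                      # devices, model dimension, samples per device
X = [rng.normal(size=(m, d)) for _ in range(n)]
y = [X[i] @ rng.normal(size=d) + 0.1 * rng.normal(size=m) for i in range(n)]  # heterogeneous local data

W = np.full((n, n), 1.0 / n)            # doubly stochastic mixing matrix (fully connected network)
w = [np.zeros(d) for _ in range(n)]     # local model copies
hat = [np.zeros(d) for _ in range(n)]   # publicly known estimates maintained for compressed gossip
lam = np.full(n, 1.0 / n)               # adversarial distribution weights over devices (on the simplex)
eta_w, eta_lam, gamma, k = 0.05, 0.1, 0.5, 2

for t in range(300):
    losses = np.array([np.mean((X[i] @ w[i] - y[i]) ** 2) for i in range(n)])
    # Ascent on the distribution weights via an exponentiated-gradient step (stays on the simplex).
    lam = lam * np.exp(eta_lam * losses)
    lam /= lam.sum()
    # Descent on each local model, with the local loss reweighted by its adversarial weight.
    grads = [2.0 * lam[i] / m * X[i].T @ (X[i] @ w[i] - y[i]) for i in range(n)]
    w_half = [w[i] - eta_w * grads[i] for i in range(n)]
    # Compressed consensus: each device broadcasts a top-k compressed innovation, updates the
    # public estimates, and nudges its model toward its neighbors' public estimates.
    q = [top_k(w_half[i] - hat[i], k) for i in range(n)]
    hat = [hat[i] + q[i] for i in range(n)]
    w = [w_half[i] + gamma * sum(W[i, j] * (hat[j] - hat[i]) for j in range(n)) for i in range(n)]

print("worst-case device loss:", max(np.mean((X[i] @ w[i] - y[i]) ** 2) for i in range(n)))
```

Reweighting the local losses by the ascent-updated weights is what steers the shared model toward the worst-off devices, while the compressed gossip round keeps the per-round communication cost low.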


Type: Journal
Date: 2022-05-31
Department: Systèmes de Communication
Eurecom Ref: 6920
Copyright: © EURECOM. Personal use of this material is permitted. The definitive version of this paper was published in Transactions on Machine Learning Research Journal, December 2022 and is available at:

PERMALINK: https://www.eurecom.fr/publication/6920