An Improved Algorithm for Clustered Federated Learning
Publication: Q6414606
arXiv: 2210.11538 · MaRDI QID: Q6414606
Author name not available
Publication date: 20 October 2022
Abstract: In this paper, we address the dichotomy between heterogeneous models and simultaneous training in Federated Learning (FL) via a clustering framework. We define a new clustering model for FL based on the (optimal) local models of the users: two users belong to the same cluster if their local models are close; otherwise they belong to different clusters. A standard algorithm for clustered FL, called IFCA, is proposed in (Ghosh et al., 2021); it requires suitable initialization and knowledge of hyper-parameters such as the number of clusters (often quite difficult to obtain in practical applications) in order to converge. We propose an improved algorithm, the Successive Refine Federated Clustering Algorithm (SR-FCA), which removes these restrictive assumptions. SR-FCA initially treats each user as a singleton cluster and then successively refines the cluster estimates by exploiting the similarity of users belonging to the same cluster. At every intermediate step, SR-FCA uses a robust federated learning algorithm within each cluster to exploit simultaneous training and to correct clustering errors. Furthermore, SR-FCA does not require any good initialization (warm start), either in theory or in practice. We show that, with a proper choice of learning rate, SR-FCA incurs arbitrarily small clustering error. Additionally, we validate the performance of our algorithm on standard FL datasets with non-convex problems such as neural networks, and we show the benefits of SR-FCA over baselines.
Has companion code repository: https://github.com/harshv834/sr-fca
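The refinement idea described in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation (see the repository above for that): the scalar "models", the distance threshold `eps`, and the use of a plain average in place of the paper's robust intra-cluster federated training step are all simplifying assumptions made here for illustration.

```python
def local_model(data):
    """Toy local training: for scalar least squares, the optimal model is the mean."""
    return sum(data) / len(data)

def refine_clusters(user_data, eps):
    """Successive refinement sketch: start from singleton clusters and
    repeatedly merge clusters whose (averaged) models are within eps."""
    # Initialization: every user is its own singleton cluster.
    clusters = [[u] for u in range(len(user_data))]
    models = [local_model(d) for d in user_data]
    merged = True
    while merged:
        merged = False
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                # Cluster model = average of member models (a stand-in for
                # the robust federated training run within each cluster).
                mi = sum(models[u] for u in clusters[i]) / len(clusters[i])
                mj = sum(models[u] for u in clusters[j]) / len(clusters[j])
                if abs(mi - mj) < eps:
                    # Close models => same cluster: merge and restart the scan.
                    clusters[i].extend(clusters[j])
                    del clusters[j]
                    merged = True
                    break
            if merged:
                break
    return clusters

# Two groups of users with well-separated local optima: users 0-2 near 0,
# users 3-5 near 10. Refinement should recover exactly these two groups.
data = [[0.1, -0.1], [0.2, 0.0], [-0.2, 0.1],
        [9.9, 10.1], [10.2, 9.8], [10.0, 10.0]]
clusters = refine_clusters(data, eps=1.0)
print(sorted(sorted(c) for c in clusters))  # → [[0, 1, 2], [3, 4, 5]]
```

Note that, in line with the abstract, no warm start and no prior knowledge of the number of clusters is needed: the cluster count emerges from the merging process itself.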