Journal article, ACM Transactions on Knowledge Discovery from Data (TKDD), Year: 2025

Efficient Federated Learning with Heterogeneous Data and Adaptive Dropout

Abstract

Federated Learning (FL) is a promising distributed machine learning approach that enables collaborative training of a global model across multiple edge devices. The data distributed among edge devices is typically highly heterogeneous, so FL faces the challenge of statistical heterogeneity: non-Independent and Identically Distributed (non-IID) data across edge devices may cause a significant drop in accuracy. Furthermore, the limited computation and communication capabilities of edge devices increase the likelihood of stragglers, leading to slow model convergence. In this paper, we propose the FedDHAD FL framework, which comprises two novel methods: Dynamic Heterogeneous model aggregation (FedDH) and Adaptive Dropout (FedAD). FedDH dynamically adjusts the weight of each local model during aggregation according to the non-IID degree of its data, addressing statistical data heterogeneity. FedAD performs neuron-adaptive operations tailored to heterogeneous devices, improving accuracy while maintaining high efficiency. The combination of these two methods makes FedDHAD significantly outperform state-of-the-art solutions in terms of accuracy (up to 6.7% higher), efficiency (up to 2.02 times faster), and computation cost (up to 15.0% lower).
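The abstract does not give FedDH's exact weighting rule, but the idea of scaling each client's aggregation weight by its non-IID degree can be sketched as follows. This is an illustrative sketch only, not the paper's method: the `noniid_degree` proxy (total-variation distance between a client's label distribution and the global one), the exponential weighting, and the `alpha` parameter are all assumptions made for the example.

```python
import numpy as np

def noniid_degree(local_dist, global_dist):
    """Total-variation distance between a client's label distribution
    and the global one, used here as a proxy for its non-IID degree."""
    return 0.5 * np.abs(np.asarray(local_dist) - np.asarray(global_dist)).sum()

def aggregate(local_models, label_dists, alpha=1.0):
    """Aggregate local model parameter vectors, down-weighting clients
    whose data deviate more from the global label distribution."""
    label_dists = np.asarray(label_dists, dtype=float)
    global_dist = label_dists.mean(axis=0)
    degrees = np.array([noniid_degree(d, global_dist) for d in label_dists])
    weights = np.exp(-alpha * degrees)   # more non-IID -> smaller weight
    weights /= weights.sum()             # normalize to sum to 1
    return sum(w * m for w, m in zip(weights, local_models))
```

When all clients hold identically distributed data, every degree is zero and the rule reduces to plain federated averaging; as a client's label distribution drifts from the global one, its contribution to the aggregated model shrinks smoothly.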

Main file

manuscript.pdf (1.69 MB)
Origin: Files produced by the author(s)
Dates and versions

lirmm-05161239, version 1 (14-07-2025)
lirmm-05161239, version 2 (15-07-2025)

Cite

Ji Liu, Beichen Ma, Qiaolin Yu, Yang Zhou, Jingbo Zhou, et al. Efficient Federated Learning with Heterogeneous Data and Adaptive Dropout. ACM Transactions on Knowledge Discovery from Data (TKDD), 2025, 19 (8), pp.1-31. ⟨10.1145/3749376⟩. ⟨lirmm-05161239v2⟩
191 views
256 downloads
