Fisher Information-based Efficient Curriculum Federated Learning with Large Language Models
Conference paper · 2024 · LIRMM - Laboratoire d'Informatique, de Robotique et de Microélectronique de Montpellier


  • Ji Liu
  • Jiaxiang Ren
  • Ruoming Jin
  • Zijie Zhang
  • Yang Zhou
  • Dejing Dou

Abstract

As a promising paradigm for collaboratively training models with decentralized data, Federated Learning (FL) can be exploited to fine-tune Large Language Models (LLMs). While LLMs are huge, the scale of the training data also increases significantly, which leads to tremendous computation and communication costs. The training data is generally non-Independent and Identically Distributed (non-IID), which requires adaptive data processing within each device. Although Low-Rank Adaptation (LoRA) can significantly reduce the number of parameters to update during fine-tuning, it still takes an unaffordable amount of time to transfer the low-rank parameters of all the layers in LLMs. In this paper, we propose a Fisher Information-based Efficient Curriculum Federated Learning framework (FibecFed) with two novel methods, i.e., adaptive federated curriculum learning and efficient sparse parameter update. First, we propose a Fisher information-based method to adaptively sample data within each device to improve the effectiveness of the FL fine-tuning process. Second, we dynamically select the proper layers for global aggregation and sparse parameters for local update with LoRA so as to improve the efficiency of the FL fine-tuning process. Extensive experimental results based on 10 datasets demonstrate that FibecFed yields excellent performance (up to 45.35% better in terms of accuracy) and superb fine-tuning speed (up to 98.61% faster) compared with 17 baseline approaches. Our code will be publicly available.
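The abstract only names the idea of Fisher information-based data sampling for curriculum learning; the paper's actual formulation is not given here. Below is a minimal, hypothetical sketch of ranking local samples by a Fisher-information proxy (the squared gradient norm of the per-sample log-likelihood), with logistic regression standing in for an LLM. All function names and the scoring choice are illustrative assumptions, not the authors' method.

```python
import numpy as np

def fisher_score(x, y, w):
    """Per-sample Fisher information proxy: squared norm of the gradient of
    the negative log-likelihood for a logistic-regression model (a toy
    stand-in for the LLM fine-tuning setting in the paper)."""
    p = 1.0 / (1.0 + np.exp(-x @ w))   # predicted probability
    grad = (p - y) * x                  # gradient of the NLL w.r.t. w
    return float(grad @ grad)           # squared norm as the Fisher proxy

def curriculum_order(X, Y, w):
    """Order a device's local samples easy-to-hard by ascending score."""
    scores = [fisher_score(x, y, w) for x, y in zip(X, Y)]
    return np.argsort(scores)

# Toy local dataset on one device.
rng = np.random.default_rng(0)
X = rng.normal(size=(8, 3))
Y = rng.integers(0, 2, size=8).astype(float)
w = np.zeros(3)
order = curriculum_order(X, Y, w)  # indices, easiest samples first
```

In an FL round, each device would score its own data locally and feed batches in this order, so no raw data or scores need to leave the device.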
Main file: EMNLP2024.pdf (916.57 KB)
Origin: Files produced by the author(s)

Dates and versions

lirmm-04734309 , version 1 (13-10-2024)

Identifiers

  • HAL Id : lirmm-04734309 , version 1

Cite

Ji Liu, Jiaxiang Ren, Ruoming Jin, Zijie Zhang, Yang Zhou, et al. Fisher Information-based Efficient Curriculum Federated Learning with Large Language Models. EMNLP 2024 - Conference on Empirical Methods in Natural Language Processing, ACL SIGDAT, Nov 2024, Miami, FL, United States. pp.1-27. ⟨lirmm-04734309⟩