
Fed-OGD: Mitigating Straggler Effects in Federated Learning via Orthogonal Gradient Descent

  • Wei LI
  • Zicheng SHEN
  • Xiulong LIU
  • Chuntao DING
  • Jiaxing SHEN

Research output: Journal Publications › Journal Article (refereed) › peer-review

Abstract

Federated Learning (FL) faces challenges due to straggler clients that impede timely parameter uploads, potentially leading to suboptimal global model performance. Existing approaches using synchronous and asynchronous communication suffer from long waiting times or convergence issues. We propose Fed-OGD, a novel asynchronous FL method addressing the straggler problem through gradient orthogonalization. Our approach innovatively frames the straggler issue using catastrophic forgetting theory, viewing stragglers as instances of the global model “forgetting” to aggregate their parameters. Fed-OGD introduces an Orthogonal Gradient Descent (OGD) technique that caches straggler gradients and orthogonalizes the difference between these and current active client gradients. By projecting active gradients onto straggler orthogonal bases and subtracting the resulting components, we obtain orthogonalized gradients guiding the model towards optimality. We provide theoretical convergence guarantees and demonstrate Fed-OGD’s effectiveness through extensive experiments. Our method achieves state-of-the-art performance across multiple datasets compared with existing FL baselines, with notable improvements in non-IID (non-independent and identically distributed) scenarios, where each client holds many samples in a few main categories and only a few samples in the others. Fed-OGD achieves a 16.66% accuracy increase on CIFAR-10, with significant gains on CIFAR-100 (5.37%), Tiny-ImageNet (38.51%), and AG_NEWS (16.30%).
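The projection-and-subtraction step described in the abstract can be sketched in a few lines of NumPy. This is a minimal illustration under assumed details, not the paper's implementation: it builds an orthonormal basis for the cached straggler gradients via QR decomposition (one way to obtain "straggler orthogonal bases"), projects an active client gradient onto that subspace, and subtracts the projection. Function and variable names are hypothetical.

```python
import numpy as np

def orthogonalize_gradient(active_grad, straggler_grads):
    """Remove from `active_grad` the components lying in the span of the
    cached straggler gradients. A sketch of the OGD projection step
    described in the abstract; names and details are assumptions."""
    # Stack cached straggler gradients as columns, shape (d, k).
    S = np.stack(straggler_grads, axis=1)
    # QR decomposition yields orthonormal basis vectors in Q's columns.
    Q, _ = np.linalg.qr(S)
    # Project the active gradient onto the straggler subspace, then subtract.
    projection = Q @ (Q.T @ active_grad)
    return active_grad - projection

# Toy check: the result is orthogonal to every cached straggler gradient.
rng = np.random.default_rng(0)
g_active = rng.normal(size=8)
g_stale = [rng.normal(size=8) for _ in range(3)]
g_orth = orthogonalize_gradient(g_active, g_stale)
print(all(abs(np.dot(g_orth, s)) < 1e-8 for s in g_stale))  # True
```

The subtraction leaves only the component of the active gradient that does not conflict with the cached straggler directions, which is the intuition behind using orthogonalized gradients to avoid "forgetting" straggler updates.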
Original language: English
Pages (from-to): 3018-3031
Number of pages: 14
Journal: IEEE Transactions on Computers
Volume: 74
Issue number: 9
Early online date: 30 Jun 2025
DOIs
Publication status: Published - 2025

Bibliographical note

Publisher Copyright:
© 1968-2012 IEEE.

Funding

This work was supported in part by the National Key Research and Development Program of China under Grant 2022YFF1101100, in part by the National Natural Science Foundation of China under Grant 62202039, and in part by the financial support of Lingnan University (LU) under Grant DR25F4 and Lam Woo Research Fund at LU under Grant LWP20021.

Keywords

  • Federated learning
  • catastrophic forgetting
  • gradient orthogonalization
  • straggler issue
