Edge-Based Communication Optimization for Distributed Federated Learning

Tian WANG, Yan LIU, Xi ZHENG, Hong Ning DAI, Weijia JIA, Mande XIE

Research output: Journal Publications › Journal Article (refereed) › peer-review

19 Citations (Scopus)


Federated learning achieves distributed machine learning without sharing the private and sensitive data of end devices. However, highly concurrent access to the server increases the transmission delay of model updates, and a local model whose gradient opposes that of the global model may be an unnecessary update, incurring substantial additional communication cost. To this end, we study an edge-based communication optimization framework that reduces the number of end devices directly connected to the server while avoiding the upload of unnecessary local updates. Specifically, we cluster devices at the same network location and deploy mobile edge nodes at different network locations to serve as hubs for cloud-to-device communication, thereby avoiding the latency caused by high server concurrency. Meanwhile, we propose a model-cleaning method based on cosine similarity: if the similarity is below a preset threshold, the local update is not uploaded to the mobile edge nodes, thus avoiding unnecessary communication. Experimental results show that, compared with traditional federated learning, the proposed scheme reduces the number of local updates by 60% and accelerates the convergence of the regression model by 10.3%.
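The cosine-similarity model-cleaning rule described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the flattening of model updates into vectors, the function names, and the threshold value of 0 are all illustrative assumptions.

```python
import numpy as np

def cosine_similarity(u, v):
    # Cosine of the angle between two flattened update vectors.
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

def should_upload(local_update, global_update, threshold=0.0):
    # Upload the local update only if its direction is sufficiently
    # aligned with the global update; otherwise skip the upload to
    # save communication. The threshold is an assumed tunable value.
    return cosine_similarity(local_update, global_update) >= threshold

# Example: a well-aligned local update passes the filter,
# while an update with the opposite gradient direction is dropped.
g = np.array([1.0, 2.0, -1.0])       # stand-in for the global update
aligned = np.array([0.9, 2.1, -1.2])  # roughly the same direction
opposed = -g                          # exactly the opposite direction

print(should_upload(aligned, g))  # True
print(should_upload(opposed, g))  # False
```

A threshold of 0 drops only updates whose gradient points away from the global model; raising the threshold filters more aggressively at the cost of discarding partially useful updates.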

Original language: English
Journal: IEEE Transactions on Network Science and Engineering
Publication status: E-pub ahead of print - 3 Jun 2021
Externally published: Yes

Keywords
  • Clustering
  • Collaborative work
  • Communication optimization
  • Computational modeling
  • Data models
  • Data privacy
  • Federated learning
  • Mobile edge nodes
  • Model filtering
  • Optimization
  • Servers
  • Training

