Distributed Semi-Supervised Learning with Consensus Consistency on Edge Devices

Hao-Rui CHEN, Lei YANG, Xinglin ZHANG, Jiaxing SHEN, Jiannong CAO

Research output: Journal Publications › Journal Article (refereed) › peer-review

2 Citations (Scopus)

Abstract

Distributed learning has been increasingly studied in edge computing, enabling edge devices to learn a model collaboratively without exchanging their private data. However, existing approaches assume that the private data owned by edge devices are fully labeled, whereas in reality massive amounts of private data are unlabeled and remain underutilized, which leads to suboptimal performance. To overcome this limitation, we study a new practical problem, Distributed Semi-Supervised Learning (DSSL), which learns models collaboratively from mixed labeled and unlabeled private data on each device. We also propose a novel method, DistMatch, that exploits private unlabeled data by self-training on each device with the help of models received from neighboring devices. DistMatch generates pseudo-labels for unlabeled data by properly averaging the predictions of these received models. Furthermore, to avoid self-training with wrong pseudo-labels, DistMatch introduces a consensus consistency loss that keeps only pseudo-labels with high consensus and forces the output of the trained model to be consistent with them. Extensive evaluation on our self-developed testbed indicates that the proposed method outperforms all baselines on commonly used image classification benchmark datasets.
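
As a rough illustration of the self-training step described above, the sketch below averages the class probabilities predicted by models received from neighboring devices, keeps only pseudo-labels with high consensus confidence, and penalizes the local model's disagreement with those pseudo-labels. This is a minimal PyTorch-style sketch under assumed details: the function name, the confidence threshold, and the tensor layout are illustrative assumptions, not the paper's actual DistMatch formulation.

```python
# Illustrative sketch only: the exact DistMatch averaging scheme, consensus
# filter, and loss weighting are defined in the paper; the threshold value
# and names below are assumptions made for illustration.
import torch
import torch.nn.functional as F

def consensus_consistency_loss(local_logits, neighbor_logits_list, threshold=0.9):
    """Consensus-consistency-style loss on a batch of unlabeled samples.

    local_logits:          [batch, num_classes] logits from the local model.
    neighbor_logits_list:  list of [batch, num_classes] logits from models
                           received from neighboring devices.
    threshold:             minimum averaged confidence for a pseudo-label
                           to be kept (assumed value, not from the paper).
    """
    # Average the neighbors' predicted class probabilities to form soft targets.
    neighbor_probs = torch.stack(
        [F.softmax(logits, dim=-1) for logits in neighbor_logits_list], dim=0
    ).mean(dim=0)                                      # [batch, num_classes]

    # Hard pseudo-labels and their consensus confidence.
    confidence, pseudo_labels = neighbor_probs.max(dim=-1)

    # Keep only samples on which the neighboring models agree with high confidence.
    mask = (confidence >= threshold).float()

    # Force the local model's output to be consistent with the kept pseudo-labels.
    per_sample_ce = F.cross_entropy(local_logits, pseudo_labels, reduction="none")
    return (per_sample_ce * mask).sum() / mask.sum().clamp(min=1.0)
```

In an actual DSSL round, each device would combine a loss of this kind on its unlabeled data with a standard supervised loss on its labeled data before exchanging model updates with its neighbors.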
Original language: English
Pages (from-to): 310-323
Number of pages: 14
Journal: IEEE Transactions on Parallel and Distributed Systems
Volume: 35
Issue number: 2
Early online date: 8 Dec 2023
DOIs
Publication status: Published - Feb 2024

Bibliographical note

Publisher Copyright:
© 1990-2012 IEEE.

Funding

This work was supported in part by the National Natural Science Foundation of China under Grants 61972161 and 62372185, in part by the Guangdong Basic and Applied Basic Research Foundation under Grant 2020A1515011496, in part by the Guangzhou Basic and Applied Basic Research Foundation under Grant 202201010715, in part by the Hong Kong RGC General Research Fund under Grants PolyU 15217919 and PolyU 152133/18, in part by the Hong Kong RGC Theme-based Research Scheme (TRS) under Grant T43-513/23-N, in part by the HK RGC Research Impact Fund under Grant R5060-19, and in part by Lingnan University, Hong Kong Special Administrative Region, China. Recommended for acceptance by S. Wang.

Keywords

  • Consistency regularization
  • distributed machine learning
  • semi-supervised learning
