D3FU : Data-Free Distillation Driven Federated Unlearning for Service-Oriented Computing

  • Xiuyi ZHANG
  • Xuejun LI*
  • Aiting YAO
  • Jia XU
  • Chengzu DONG
  • Frank JIANG
  • Xiao LIU
  • Yun YANG

*Corresponding author for this work

Research output: Book Chapters | Papers in Conference Proceedings › Conference paper (refereed) › Research › peer-review

Abstract

Edge Computing (EC) enables deep neural network training on distributed data, yet it raises significant privacy concerns, particularly under regulations enforcing the "right to be forgotten". Federated Unlearning (FU) offers a solution by allowing targeted data unlearning without full retraining. In Service-Oriented Computing (SOC) systems, where services are composed dynamically and data flows across multiple decentralized nodes, deploying FU introduces additional challenges. Specifically, the lack of direct access to raw data within loosely coupled services, along with the high communication cost required for coordination among distributed components, significantly hinders effective unlearning. Therefore, we propose D3FU, an efficient service-compatible framework that leverages data-free knowledge distillation to achieve self-contained FU. This framework performs local unlearning through Projected Gradient Descent (PGD), which may initially degrade model performance. To mitigate the resulting bias, we integrate Model-Agnostic Meta-Learning (MAML) techniques to generate task-relevant pseudo-samples, thereby enabling data-free distillation and correcting the gradient updates of the local unlearned model. This process effectively restores model performance while ensuring accurate unlearning. Our experimental results, including evaluations of backdoor attacks, demonstrate that D3FU achieves unlearning effects comparable to retraining from scratch while reducing communication cost by up to 32 times.
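The abstract's core idea of PGD-based local unlearning can be illustrated with a toy sketch: gradient *ascent* on the loss of the data to be forgotten, with each step projected back into an L2 ball around the reference model so the unlearned model does not drift arbitrarily far. This is a minimal NumPy illustration on a linear least-squares model under assumed hyperparameters (ball radius, step size), not the paper's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))              # retained training data
w_true = rng.normal(size=5)
y = X @ w_true + 0.1 * rng.normal(size=100)
w = np.linalg.lstsq(X, y, rcond=None)[0]   # trained model

x_f = rng.normal(size=5)                   # sample to forget
y_f = x_f @ w_true
forget_loss_before = (x_f @ w - y_f) ** 2

w_ref, radius, lr = w.copy(), 0.5, 0.05    # assumed PGD hyperparameters
for _ in range(50):
    grad = 2 * (x_f @ w - y_f) * x_f       # gradient of the forget-sample loss
    w = w + lr * grad                      # ascent: increase loss on forgotten data
    delta = w - w_ref
    norm = np.linalg.norm(delta)
    if norm > radius:                      # project back into the L2 ball
        w = w_ref + delta * (radius / norm)

forget_loss = (x_f @ w - y_f) ** 2         # grows: the sample is "unlearned"
retain_loss = np.mean((X @ w - y) ** 2)    # stays bounded by the projection
```

The projection is what keeps the unlearned model usable: without it, ascent would destroy performance on retained data, which is the degradation the paper then corrects via data-free distillation with MAML-generated pseudo-samples.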

Original language: English
Title of host publication: Service-Oriented Computing - 23rd International Conference, ICSOC 2025, Proceedings
Editors: Marco Aiello, Ilche Georgievski, Shuiguang Deng, Juan-Manuel Murillo, Boualem Benatallah, Zhongjie Wang
Publisher: Springer Science and Business Media Deutschland GmbH
Pages: 381-395
Number of pages: 15
ISBN (Print): 9789819550111
DOIs
Publication status: E-pub ahead of print - 2 Jan 2026
Event: 23rd International Conference on Service-Oriented Computing, ICSOC 2025 - Shenzhen, China
Duration: 1 Dec 2025 - 4 Dec 2025

Publication series

Name: Lecture Notes in Computer Science
Volume: 16320 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: 23rd International Conference on Service-Oriented Computing, ICSOC 2025
Country/Territory: China
City: Shenzhen
Period: 1/12/25 - 4/12/25

Bibliographical note

Publisher Copyright:
© The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd. 2026.

Funding

This work was supported by the National Natural Science Foundation of China Project (No. 62372004).

Keywords

  • Communication Efficiency
  • Data-Free Knowledge Distillation
  • Edge Computing
  • Federated Unlearning
  • Service Computing
