Replay Without Saving: Prototype Derivation and Distribution Rebalance for Class-Incremental Semantic Segmentation

Jinpeng CHEN, Runmin CONG, Yuxuan LUO, Horace Ho Shing IP, Sam KWONG*

*Corresponding author for this work

Research output: Journal Publications › Journal Article (refereed) › peer-review

Abstract

Research on class-incremental semantic segmentation (CISS) seeks to enhance semantic segmentation methods by enabling the progressive learning of new classes while preserving knowledge of previously learned ones. A significant yet often neglected challenge in this domain is class imbalance. In CISS, each task focuses on different foreground classes, and the training set for each task comprises only images that contain these classes. This over-represents the currently focused classes within the single-task training set, leading to a classification bias towards them. To address this issue, we propose a novel CISS method named STAR, whose core principle is to reintegrate the missing proportions of previous classes into the current single-task training samples by replaying their prototypes. Moreover, we develop a prototype derivation technique that deduces past-class prototypes by integrating the recognition patterns of the classifiers with the extraction patterns of the feature extractor. With this technique, replay is accomplished without using any storage to save prototypes. Complementing our method, we devise two loss functions to enforce cross-task feature constraints: the Old-Class Features Maintaining (OCFM) loss and the Similarity-Aware Discriminative (SAD) loss. The OCFM loss stabilizes the feature space of old classes, preserving previously acquired knowledge without compromising the ability to learn new classes. The SAD loss enhances feature distinctions between similar old-new class pairs, minimizing potential confusion. Our experiments on two public datasets, Pascal VOC 2012 and ADE20K, demonstrate that STAR achieves state-of-the-art performance.
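The abstract describes prototype replay for rebalancing the class distribution plus two feature-space losses (OCFM and SAD). As a rough illustration only, the PyTorch sketch below shows one way such components could be wired together; every function name, tensor shape, and loss form here is an assumption made for illustration and is not taken from the paper's actual implementation.

import torch
import torch.nn.functional as F

# Illustrative sketch only; names, shapes, and loss forms are assumptions,
# not the paper's implementation. D = feature dim, C = total classes so far.

def replay_prototype_loss(classifier, old_prototypes, old_labels):
    # Replay old-class prototypes through the current classifier so that
    # previous classes remain represented in the classification objective.
    logits = classifier(old_prototypes)                  # (N_proto, C)
    return F.cross_entropy(logits, old_labels)

def ocfm_loss(feat_cur, feat_old, old_region_mask):
    # Old-Class Features Maintaining (sketch): keep current features close to
    # the old model's features on regions attributed to previous classes.
    diff = (feat_cur - feat_old).pow(2).mean(dim=1)      # (B, H, W)
    return (diff * old_region_mask).sum() / old_region_mask.sum().clamp(min=1)

def sad_loss(proto_old, proto_new, margin=0.5):
    # Similarity-Aware Discriminative (sketch): penalize old/new prototype
    # pairs whose cosine similarity exceeds a margin, pushing them apart.
    sim = F.cosine_similarity(proto_old.unsqueeze(1),    # (C_old, 1, D)
                              proto_new.unsqueeze(0),    # (1, C_new, D)
                              dim=-1)                    # -> (C_old, C_new)
    return F.relu(sim - margin).mean()

# Minimal usage with random tensors (hypothetical sizes).
if __name__ == "__main__":
    D, C_old, C_new = 256, 10, 5
    classifier = torch.nn.Linear(D, C_old + C_new)
    old_protos = torch.randn(C_old, D)
    old_labels = torch.arange(C_old)
    feats_cur = torch.randn(2, D, 33, 33)
    feats_old = torch.randn(2, D, 33, 33)
    mask = (torch.rand(2, 33, 33) > 0.5).float()
    new_protos = torch.randn(C_new, D)
    total = (replay_prototype_loss(classifier, old_protos, old_labels)
             + ocfm_loss(feats_cur, feats_old, mask)
             + sad_loss(old_protos, new_protos))
    print(total.item())

The design intent mirrored here is that replayed prototypes restore the missing share of old classes in the classification loss, while the two auxiliary terms constrain the feature space; how the paper actually derives prototypes from classifier and feature-extractor patterns is not reproduced in this sketch.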

Original language: English
Journal: IEEE Transactions on Pattern Analysis and Machine Intelligence
Early online date: 25 Feb 2025
Publication status: E-pub ahead of print - 25 Feb 2025

Bibliographical note

Portions of this work were previously presented at NeurIPS 2023 under the title “Saving 100x Storage: Prototype Replay for Reconstructing Training Sample Distribution in Class-Incremental Semantic Segmentation”.

Publisher Copyright: © 1979-2012 IEEE.

Keywords

  • continual learning
  • distribution rebalance
  • prototype replay
  • class-incremental semantic segmentation
