Improving text classification via a soft dynamical label strategy

Jingjing WANG, Haoran XIE, Fu Lee WANG*, Lap-Kei LEE

*Corresponding author for this work

Research output: Journal Publications › Journal Article (refereed) › peer-review

1 Citation (Scopus)


Labels play a central role in text classification tasks. However, most studies suffer from a lossy label encoding problem, in which each label is represented by a meaningless, mutually independent one-hot vector. This paper proposes a novel strategy that dynamically generates a soft pseudo label from the model's prediction at each training step. This history-based soft pseudo label is taken as the target for parameter optimization, by minimizing the distance between the target and the prediction. In addition, we augment the training data with Mix-up, a widely used method, to prevent overfitting on small datasets. Extensive experimental results demonstrate that the proposed dynamical soft label strategy significantly improves the performance of several widely used deep learning classification models on binary and multi-class text classification tasks. Not only is our simple and efficient strategy much easier to implement and train, it also exhibits substantial improvements (up to a 2.54% relative improvement on the FDCNews dataset with an LSTM encoder) over Label Confusion Learning (LCM), a state-of-the-art label smoothing model, under the same experimental setting. The experimental results also demonstrate that Mix-up improves our method's performance on smaller datasets but introduces excess noise on larger datasets, which diminishes the model's performance.
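The abstract does not give the paper's exact formulation, but the two ingredients it names, a history-based soft pseudo-label target and Mix-up augmentation, can be sketched as follows. This is a minimal, hypothetical sketch: the exponential-moving-average blend of past predictions with the one-hot label (`momentum`, `smooth`) and the KL-divergence distance are assumptions for illustration, not the authors' published method.

```python
import torch
import torch.nn.functional as F

class DynamicSoftLabels:
    """Per-sample soft pseudo labels built from the model's prediction history.

    Hypothetical sketch: the EMA/smoothing blend below is an assumption,
    not the paper's exact formulation.
    """
    def __init__(self, num_samples, num_classes, momentum=0.9, smooth=0.5):
        self.momentum = momentum  # weight kept by the accumulated history
        self.smooth = smooth      # weight of the hard one-hot label in the target
        self.history = torch.zeros(num_samples, num_classes)

    def update(self, idx, probs, onehot):
        # Fold the current predicted distribution into the per-sample history (EMA).
        self.history[idx] = (self.momentum * self.history[idx]
                             + (1 - self.momentum) * probs.detach())
        # Target = mixture of the hard label and the prediction history, renormalized.
        target = self.smooth * onehot + (1 - self.smooth) * self.history[idx]
        return target / target.sum(dim=-1, keepdim=True)

def soft_label_loss(logits, target):
    # Minimize the distance (here: KL divergence) between target and prediction.
    return F.kl_div(F.log_softmax(logits, dim=-1), target, reduction="batchmean")

def mixup(x, target, alpha=0.2):
    # Mix-up: convex combinations of input pairs and their soft labels.
    lam = torch.distributions.Beta(alpha, alpha).sample().item()
    perm = torch.randperm(x.size(0))
    return lam * x + (1 - lam) * x[perm], lam * target + (1 - lam) * target[perm]
```

In a training loop, `update` would be called with each batch's sample indices and softmax outputs, and the returned target fed to `soft_label_loss`; `mixup` would be applied to embedded inputs and targets before the forward pass on small datasets.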
Original language: English
Pages (from-to): 2395-2405
Number of pages: 11
Journal: International Journal of Machine Learning and Cybernetics
Issue number: 7
Early online date: 19 Jan 2023
Publication status: Published - Jul 2023

Bibliographical note

Funding Information:
The research described in this article has been supported by a grant from the Research Grants Council of the Hong Kong Special Administrative Region, China (UGC/FDS16/E01/19), the Lam Woo Research Fund (LWP20019) and the Faculty Research Grants (DB22A5 and DB22B4) of Lingnan University, Hong Kong. The authors have no competing interests to declare that are relevant to the content of this article.

Publisher Copyright:
© 2023, The Author(s), under exclusive licence to Springer-Verlag GmbH Germany, part of Springer Nature.


Keywords

  • Label distribution learning
  • Mix-up
  • Text classification


