PENet-KD: Progressive Enhancement Network via Knowledge Distillation for Rail Surface Defect Detection

Bingying WANG, Wujie ZHOU*, Weiqing YAN, Qiuping JIANG, Runmin CONG

*Corresponding author for this work

Research output: Journal Publications › Journal Article (refereed) › peer-review

2 Citations (Scopus)

Abstract

Railways are an essential transportation system in modern society, and the safety of their tracks cannot be overlooked. In recent years, computer vision and deep learning have been increasingly applied to automated track defect detection. Although several algorithms have been proposed to address these safety concerns, their efficiency and accuracy still need improvement. This study introduces an efficient progressive enhancement network via knowledge distillation (PENet-KD) for detecting defects on the rail surface. In PENet-KD, we utilize knowledge distillation to transfer the expertise of the teacher network to the student network, resulting in a lightweight model with high speed and accuracy. In addition, we developed two modules to progressively refine features. First, cross-modal information is dynamically fused by a regenerative high-level attention module built on a graph convolutional network, which corrects the features derived from the encoder. Then, in the decoding stage, 3-D attentional optimization of the highest-level features yields semantic guidance that steers the progressive distillation module toward precise predictions. Extensive experiments on the industrial RGB-D NEU rail surface defect (RSDDS)-AUG benchmark dataset demonstrate that the proposed PENet-KD outperforms existing state-of-the-art (SOTA) methods, showcasing its generality and effectiveness. Notably, on the RSDDS-AUG dataset, PENet-KD achieved a maximum E-measure gain of 1.4% and an S-measure gain of 1.2% over the best existing method. The code and models used in this research are publicly available at https://github.com/Wang-5ying/PENet-KD.
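To make the teacher-to-student transfer described above concrete, the sketch below illustrates a generic response-level knowledge distillation loss. It is a minimal example under assumed settings (the temperature T, the weighting alpha, and the names teacher/student are illustrative), not the authors' actual PENet-KD loss or training code.

```python
# Minimal sketch of response-level knowledge distillation:
# a KL term that pulls the lightweight student toward the teacher's
# softened outputs, combined with the usual supervised loss.
# All hyperparameters and names here are illustrative assumptions.
import torch
import torch.nn.functional as F


def distillation_loss(student_logits, teacher_logits, targets, T=4.0, alpha=0.7):
    """Weighted sum of a soft (teacher-guided) term and a hard-label term."""
    # Soften both distributions with temperature T; the T*T factor keeps
    # gradient magnitudes comparable to the hard-label loss.
    soft_loss = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    # Standard supervised loss on the ground-truth labels.
    hard_loss = F.cross_entropy(student_logits, targets)
    return alpha * soft_loss + (1.0 - alpha) * hard_loss


# Example usage (hypothetical models): the teacher is frozen and run
# without gradients; only the student is optimized.
if __name__ == "__main__":
    student_logits = torch.randn(8, 2, requires_grad=True)  # e.g. defect / no defect
    teacher_logits = torch.randn(8, 2)
    targets = torch.randint(0, 2, (8,))
    loss = distillation_loss(student_logits, teacher_logits, targets)
    loss.backward()
```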

Original language: English
Article number: 5032811
Pages (from-to): 1-11
Number of pages: 11
Journal: IEEE Transactions on Instrumentation and Measurement
Volume: 72
Early online date: 6 Nov 2023
DOIs
Publication status: Published - 2023
Externally published: Yes

Bibliographical note

Publisher Copyright:
© 2023 IEEE.

Keywords

  • Knowledge distillation
  • progressive distillation module
  • rail surface defect detection
  • revive attention advanced module
  • transformer
