UCL-Dehaze: Toward Real-World Image Dehazing via Unsupervised Contrastive Learning

  • Yongzhen WANG
  • Xuefeng YAN*
  • Fu Lee WANG
  • Haoran XIE
  • Wenhan YANG
  • Xiao-Ping ZHANG
  • Jing QIN
  • Mingqiang WEI

*Corresponding author for this work

Research output: Journal Publications › Journal Article (refereed) › peer-review

Abstract

While training an image dehazing model on synthetic hazy data can alleviate the difficulty of collecting real-world hazy/clean image pairs, it introduces the well-known domain shift problem. From a different yet new perspective, this paper explores contrastive learning with an adversarial training effort to leverage unpaired real-world hazy and clean images, thus alleviating the domain shift problem and enhancing the network’s generalization ability in real-world scenarios. We propose an effective unsupervised contrastive learning paradigm for image dehazing, dubbed UCL-Dehaze. Unpaired real-world clean and hazy images are easily captured, and serve as the important positive and negative samples, respectively, when training our UCL-Dehaze network. To train the network more effectively, we formulate a new self-contrastive perceptual loss function, which encourages the restored images to approach the positive samples and keep away from the negative samples in the embedding space. Within the overall network architecture of UCL-Dehaze, adversarial training is utilized to align the distributions between the positive samples and the dehazed images. Compared with recent image dehazing works, UCL-Dehaze does not require paired data during training and utilizes unpaired positive/negative data to better enhance the dehazing performance. We conduct comprehensive experiments to evaluate our UCL-Dehaze and demonstrate its superiority over the state-of-the-art methods, even when only 1,800 unpaired real-world images are used to train our network. Source code is publicly available at https://github.com/yz-wang/UCL-Dehaze.
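The self-contrastive perceptual loss described in the abstract can be viewed as a ratio of feature-space distances: the restored image is pulled toward clean (positive) features and pushed away from hazy (negative) features. The following is a minimal NumPy illustration of that idea, not the paper's implementation; the function name is hypothetical, and in practice the features would come from a pretrained perceptual network rather than raw vectors:

```python
import numpy as np

def contrastive_perceptual_loss(anchor, positive, negative, eps=1e-8):
    """Hypothetical sketch of a contrastive perceptual loss.

    anchor:   features of the dehazed (restored) image
    positive: features of an unpaired clean image
    negative: features of the original hazy input
    The loss decreases as the anchor moves toward the positive
    sample and away from the negative sample in embedding space.
    """
    d_pos = np.abs(anchor - positive).mean()  # L1 distance to positive
    d_neg = np.abs(anchor - negative).mean()  # L1 distance to negative
    return d_pos / (d_neg + eps)

# Toy check: a restoration close to the clean features scores lower
clean = np.ones(8)
hazy = np.zeros(8)
good_restore = np.full(8, 0.9)  # near the clean sample
bad_restore = np.full(8, 0.1)   # near the hazy sample
assert contrastive_perceptual_loss(good_restore, clean, hazy) < \
       contrastive_perceptual_loss(bad_restore, clean, hazy)
```

Minimizing such a ratio simultaneously reduces the distance to the positive and increases the distance to the negative, which matches the push/pull behavior the abstract describes.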
Original language: English
Pages (from-to): 1361-1374
Number of pages: 14
Journal: IEEE Transactions on Image Processing
Volume: 33
Early online date: 9 Feb 2024
DOIs
Publication status: Published - 2024

Bibliographical note

Publisher Copyright:
© 1992-2012 IEEE.

Funding

This work was supported in part by the National Natural Science Foundation of China under Grant T2322012, Grant 62172218, and Grant 62032011; in part by the National Defense Basic Scientific Research Program of China under Grant JCKY2020605C003; in part by the Shenzhen Science and Technology Program under Grant JCYJ20220818103401003 and Grant JCYJ20220530172403007; in part by the Guangdong Basic and Applied Basic Research Foundation under Grant 2022A1515010170; in part by the General Research Fund of Hong Kong Research Grants Council under Grant 15218521; in part by the Lam Woo Research Fund under Grant LWP20019; in part by the Direct Grant of Lingnan University, Hong Kong under Grant DR23B2; and in part by the Faculty Research Grants of Lingnan University, Hong Kong, under Grant DB23A3 and Grant DB23B2.

UN SDGs

This output contributes to the following UN Sustainable Development Goals (SDGs)

  1. SDG 9 - Industry, Innovation, and Infrastructure

Keywords

  • UCL-Dehaze
  • adversarial training
  • image dehazing
  • unpaired data
  • unsupervised contrastive learning
