DACNN: Blind Image Quality Assessment via A Distortion-Aware Convolutional Neural Network

Zhaoqing PAN, Hao ZHANG, Jianjun LEI, Yuming FANG, Xiao SHAO, Nam LING, Sam KWONG

Research output: Journal Article (refereed), peer-reviewed

25 Citations (Scopus)


Deep neural networks have achieved strong performance on blind Image Quality Assessment (IQA), but it remains challenging for a single network to accurately predict the quality of images with different distortions. In this paper, a Distortion-Aware Convolutional Neural Network (DACNN) is proposed for blind IQA, which works effectively not only for synthetically distorted images but also for authentically distorted images. The proposed DACNN consists of a distortion-aware module, a distortion fusion module, and a quality prediction module. In the distortion-aware module, a Siamese network-based pretraining strategy is proposed to design a synthetic distortion-aware network that fully learns synthetic distortions, and an authentic distortion-aware network is used to extract authentic distortion features. To efficiently fuse the learned distortion features and make the network attend to the most informative ones, a weight-adaptive fusion network is proposed to adaptively adjust the weight of each distortion. Finally, the quality prediction module maps the fused features to a quality score. Extensive experiments on four authentic IQA databases and four synthetic IQA databases demonstrate the effectiveness of the proposed DACNN.
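The weight-adaptive fusion described in the abstract can be illustrated with a minimal NumPy sketch: two distortion-feature streams (synthetic and authentic) are combined with softmax-normalized adaptive weights, and the fused features are regressed to a scalar quality score. All names, dimensions, and the softmax gating form below are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def weight_adaptive_fusion(f_syn, f_auth, w_gate, w_reg):
    """Hypothetical sketch of weight-adaptive fusion:
    gate two feature streams with softmax weights, then
    map the fused features to one quality score."""
    feats = np.stack([f_syn, f_auth])                # (2, D) feature streams
    scores = feats @ w_gate                          # one gating score per stream
    weights = np.exp(scores) / np.exp(scores).sum()  # softmax over the 2 streams
    fused = weights @ feats                          # (D,) weighted combination
    return float(fused @ w_reg)                      # scalar quality score

D = 8
f_syn = rng.standard_normal(D)    # synthetic-distortion features (toy values)
f_auth = rng.standard_normal(D)   # authentic-distortion features (toy values)
w_gate = rng.standard_normal(D)   # gating weights (learned in the real model)
w_reg = rng.standard_normal(D)    # regression weights (learned in the real model)

q = weight_adaptive_fusion(f_syn, f_auth, w_gate, w_reg)
print(q)
```

In the actual DACNN the gating and regression would be learned network layers operating on CNN feature maps; this sketch only shows the idea of per-stream adaptive weighting followed by score regression.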
Original language: English
Pages (from-to): 7518-7531
Journal: IEEE Transactions on Circuits and Systems for Video Technology
Issue number: 11
Early online date: 7 Jul 2022
Publication status: Published - Nov 2022
Externally published: Yes


  • Blind image quality assessment
  • distortion-aware network
  • fusion network
  • Siamese network

