Evaluating Visual Quality of Autostereoscopic Displays Using an Interactive Perception Network

Zhaoqing PAN, Jiaojiao YI, Bo PENG, Jianjun LEI, Fu Lee WANG, Nam LING, Sam KWONG

Research output: Journal Publications › Journal Article (refereed) › peer-review

Abstract

Visual quality assessment of autostereoscopic displays aims to evaluate the stereoscopic visual experience they provide to the viewer, which is crucial for quantifying and optimizing display performance. To evaluate this visual quality automatically, prior work has investigated the relationship between visual quality and perceptual crosstalk, since perceptual crosstalk degrades the visual quality of autostereoscopic displays. However, the visual quality of an autostereoscopic display is jointly determined by a series of display parameters, such as crosstalk, screen resolution, and screen size, so no single display parameter can accurately reflect it. To address this problem, an Interactive Perception Network (IPNet)-based visual quality assessment method is proposed in this paper, which models the relationship between the visual quality of the display and its display parameters to accurately evaluate the visual quality of autostereoscopic displays. Since the crosstalk level perceived by the human eye is highly correlated with color differences, a color-informed crosstalk aware module is proposed, which learns features highly related to perceptual crosstalk under the guidance of color difference information. To further analyze the visual impact of other perceptual fidelity parameters, such as screen resolution, screen size, and viewing angle, a multi-domain binocular interaction module is proposed, which models the binocular interaction process in both the channel and spatial domains to achieve fine-grained binocular semantic understanding. To train the proposed IPNet for evaluating the visual quality of autostereoscopic displays, a well-annotated large-scale dataset, the TJU autostereoscopic display dataset, is created, containing 800 simulated autostereoscopic displays with different display parameters.
Extensive experimental results demonstrate that the proposed IPNet achieves state-of-the-art performance in evaluating the visual quality of autostereoscopic displays.
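The abstract's premise that perceived crosstalk correlates with color differences can be illustrated with a toy sketch. This is an illustrative assumption, not the paper's actual module: it simulates crosstalk as a linear leakage of the other eye's image and scores it by the mean absolute color difference between the intended and contaminated views. All function names here are hypothetical.

```python
# Illustrative sketch only: a toy color-difference crosstalk proxy,
# not the IPNet color-informed crosstalk aware module from the paper.
# Assumed leakage model: observed = (1 - c) * intended + c * other_view,
# where c is the crosstalk level.

def mix(intended, other, c):
    """Simulate crosstalk leakage for one (R, G, B) pixel."""
    return tuple((1 - c) * a + c * b for a, b in zip(intended, other))

def color_difference(p, q):
    """Mean absolute per-channel color difference between two pixels."""
    return sum(abs(a - b) for a, b in zip(p, q)) / len(p)

def crosstalk_proxy(left_img, right_img, c):
    """Average color difference between the intended left view and the
    crosstalk-contaminated left view, over all pixels."""
    diffs = [color_difference(l, mix(l, r, c))
             for l, r in zip(left_img, right_img)]
    return sum(diffs) / len(diffs)

# Example: two tiny 2-pixel "images" with RGB values in [0, 255].
left = [(200, 50, 50), (30, 30, 30)]
right = [(50, 200, 50), (30, 30, 200)]
print(crosstalk_proxy(left, right, 0.0))  # no leakage -> 0.0
print(crosstalk_proxy(left, right, 0.1) < crosstalk_proxy(left, right, 0.3))
```

Under this leakage model the proxy is exactly linear in the crosstalk level `c`, which matches the intuition that more leakage produces larger color differences and thus a worse perceived quality.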
Original language: English
Journal: IEEE Transactions on Instrumentation and Measurement
Early online date: 28 Apr 2025
DOIs
Publication status: E-pub ahead of print - 28 Apr 2025

Bibliographical note

Publisher Copyright:
© 1963-2012 IEEE.

Keywords

  • Autostereoscopic display
  • color-informed crosstalk aware module
  • display parameter
  • multi-domain binocular interaction module
  • visual quality assessment
