User-Generated Video Quality Assessment: A Subjective and Objective Study

Yang LI, Shengbin MENG, Xinfeng ZHANG, Meng WANG, Shiqi WANG, Yue WANG, Siwei MA*

*Corresponding author for this work

Research output: Journal Article (refereed), peer-review

19 Citations (Scopus)

Abstract

Recently, we have observed an exponential increase in user-generated content (UGC) videos. The distinguishing characteristic of UGC videos originates from the video production and delivery chain, as they are usually acquired and processed by non-professional users before being uploaded to hosting platforms for sharing. As such, these videos typically undergo multiple distortion stages that may affect visual quality before ultimately being viewed. Inspired by the increasing consensus that the optimization of video coding and processing should be fully driven by perceptual quality, in this paper we propose to study the quality of UGC videos from both objective and subjective perspectives. We first construct a UGC video quality assessment (VQA) database, aiming to provide useful guidance for UGC video coding and processing in hosting platforms. The database contains source UGC videos uploaded to the platform and their transcoded versions that are ultimately enjoyed by end-users, along with their subjective scores. Furthermore, we develop an objective quality assessment algorithm that automatically evaluates the quality of the transcoded videos based on the corrupted reference, which is in accordance with the application scenario of UGC video sharing on hosting platforms. The information from the corrupted reference is well leveraged, and the quality is predicted from inferred quality maps with deep neural networks (DNNs). Experimental results show that the proposed method yields superior performance. Both the subjective and objective evaluations of UGC videos also shed light on the design of perceptual UGC video coding.
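To make the corrupted-reference idea concrete: the reference available on the hosting platform (the uploaded UGC video) is itself distorted, and the transcoded output is compared against it via a spatial quality map that is then pooled into a score. The paper infers this map with a DNN; the sketch below is only an illustrative, hypothetical stand-in using a simple pixelwise similarity map and mean pooling, not the authors' actual model.

```python
import numpy as np

def quality_map(ref: np.ndarray, dist: np.ndarray, eps: float = 1e-6) -> np.ndarray:
    """Pixelwise similarity map in (0, 1]; 1.0 means identical.

    `ref` is the corrupted reference (the uploaded UGC frame) and `dist`
    is the transcoded frame. A DNN-inferred map would replace this
    hand-crafted one in the paper's pipeline.
    """
    diff = (ref.astype(np.float64) - dist.astype(np.float64)) ** 2
    # Normalize squared error by the reference variance so the map is
    # roughly content-adaptive; eps guards against flat frames.
    return 1.0 / (1.0 + diff / (eps + ref.astype(np.float64).var()))

def pooled_score(ref: np.ndarray, dist: np.ndarray) -> float:
    # Spatial mean pooling of the quality map gives a frame-level score.
    return float(quality_map(ref, dist).mean())
```

Identical frames score 1.0, and any transcoding distortion lowers the pooled score, mirroring how a map-then-pool design turns local degradation into a single frame-level prediction.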
Original language: English
Pages (from-to): 154-166
Number of pages: 13
Journal: IEEE Transactions on Multimedia
Volume: 25
Early online date: 29 Oct 2021
DOIs
Publication status: Published - 2023
Externally published: Yes

Bibliographical note

This work was supported in part by the National Natural Science Foundation of China under Grants 62025101, 61961130392, and 61931014, in part by the National Key Research and Development Project under Grant 2019YFF0302703, in part by the Hong Kong ITF UICP under Grant 9440203, in part by the Fundamental Research Funds for the Central Universities and the PKU-Baidu Fund under Grant 2019BD003, and in part by the High-Performance Computing Platform of Peking University.

Keywords

  • Deep neural network
  • user-generated content
  • video quality assessment
