Abstract

Author(s) Saupe, D., Hahn, F., Hosu, V., Zingman, I., Rana, M., Li, S.
Title Crowd workers proven useful: A comparative study of subjective video quality assessment
Abstract We carried out crowdsourced video quality assessments using paired comparisons and converted the results to differential mean opinion scores (DMOS). A previous lab-based study had provided corresponding MOS values from absolute category ratings. Using a simple linear transformation to fit the crowdsourcing-based DMOS values to the lab-based MOS values, we compared the two in terms of correlation coefficients and visually inspected the relationship on scatter plots. The agreement is surprisingly good, with correlation coefficients above 0.96, even though (1) the original video sequences had to be cropped and downscaled for the crowdsourcing experiments, (2) the experimental setup was much less controlled in the crowdsourcing case, and (3) it is widely believed that data from crowd workers are less reliable. Our results suggest that crowd workers can in fact be used to collect reliable VQA data in some applications.
Download SaHaHo16.pdf