projects
Perceptual visual quality metrics
ColorVideoVDP: A visual difference predictor for image, video and display distortions
Rafał K. Mantiuk, Param Hanji, Maliha Ashraf, Yuta Asano, Alexandre Chapiro
SIGGRAPH, ACM TOG—Transactions on Graphics (2024)
ColorVideoVDP is a video and image quality metric that models spatial and temporal aspects of vision for both luminance and color. The metric is built on novel psychophysical models of chromatic spatiotemporal contrast sensitivity and cross-channel contrast masking. It accounts for the viewing conditions and for the geometric and photometric characteristics of the display. It was trained to predict common video-streaming distortions (e.g., video compression, rescaling, and transmission errors) as well as 8 new distortion types related to AR/VR displays (e.g., light source and waveguide non-uniformities). To address the latter application, we collected a novel XR-Display-Artifact-Video quality dataset (XR-DAVID), comprising 336 distorted videos. Extensive testing on XR-DAVID, as well as on several datasets from the literature, indicates a significant gain in prediction performance compared to existing metrics. ColorVideoVDP opens the door to many novel applications that require joint automated spatiotemporal assessment of luminance and color distortions, including video streaming, display specification and design, visual comparison of results, and perceptually-guided quality optimization. The code for the metric can be found on the project website.
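For reference, a minimal sketch of how the metric might be invoked from Python, assuming the released pycvvdp package exposes a cvvdp class with a predict() method and a named display model as described in the project repository; the exact class names, arguments, and return values are assumptions and may differ from the actual API:

    # Minimal sketch (assumed API): score a distorted image against a reference
    # with ColorVideoVDP, configured for a particular display and viewing setup.
    import numpy as np
    import pycvvdp  # assumed package name from the project release

    # Synthesize a reference frame and a mildly distorted test frame (HxWx3, [0, 1]).
    reference = np.clip(np.random.rand(1080, 1920, 3), 0.0, 1.0).astype(np.float32)
    test = np.clip(reference + 0.02 * np.random.randn(*reference.shape),
                   0.0, 1.0).astype(np.float32)

    # The metric depends on the display's geometric and photometric model,
    # selected here by an assumed preset name.
    metric = pycvvdp.cvvdp(display_name='standard_4k')

    # predict() is assumed to return a quality score in JOD units plus statistics.
    jod, stats = metric.predict(test, reference, dim_order='HWC')
    print(f'Predicted quality: {jod:.3f} JOD')

In practice the same metric would be run on video tensors or via the project's command-line tool rather than on random frames; the snippet only illustrates the test-versus-reference calling pattern.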
Visible Difference Predictors: A Class of Perception-Based Metrics
Alexandre Chapiro, Param Hanji, Maliha Ashraf, Yuta Asano, Rafał K. Mantiuk
SID Symposium Digest of Technical Papers (2024)
The Visible Difference Predictors (VDPs) are a class of data-driven, white-box, efficiently implemented image and video difference metrics. They model important aspects of perception, such as spatial and temporal vision and foveation, and are calibrated on datasets relevant to display and graphics applications. In this paper, we present a historical retrospective of VDPs, a high-level technical overview, and a comparison to other metrics in the literature. Finally, we put forward a practical guide for selecting the appropriate metric for a given engineering problem and discuss how metrics can be effectively combined with subjective testing for high-confidence assessments.
XR-DAVID: XR Display Artifact Video Dataset
Alexandre Chapiro, Yuta Asano, Rafał K. Mantiuk, Param Hanji, Maliha Ashraf
Dataset repository (2024)
A video quality dataset of XR (AR/VR) display artifacts, created to measure the effect of display distortions, such as colour fringes or dithering, on image quality.