Perceptual Calibration for Immersive Display Environments
IEEE Transactions on Visualization and Computer Graphics, Volume 19, Number 4, pages 691-700 — April 2013
The perception of objects, depth, and distance has repeatedly been shown to differ between virtual and physical
environments. We hypothesize that many of these discrepancies stem from incorrect geometric viewing parameters, specifically that
physical measurements of eye position are insufficiently precise to provide proper viewing parameters. In this paper, we introduce a
perceptual calibration procedure derived from geometric models. While most research has used geometric models to predict perceptual
errors, we instead use these models inversely to determine perceptually correct viewing parameters. We study the advantages of
these new psychophysically determined viewing parameters compared to the commonly used measured viewing parameters in an
experiment with 20 subjects. The perceptually calibrated viewing parameters for the subjects generally produced new virtual eye
positions that were wider and deeper than standard practices would estimate. Our study shows that perceptually calibrated viewing
parameters can significantly improve depth acuity, distance estimation, and the perception of shape.
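The abstract's premise is that the virtual eye position directly determines the projection geometry, so even small calibration errors distort perceived depth and shape. A minimal sketch of that relationship, using the standard off-axis (asymmetric) frustum for a fixed physical screen — all names, screen dimensions, and eye positions here are illustrative assumptions, not the paper's implementation:

```python
# Hedged sketch (not the paper's code): how the virtual eye position
# changes the off-axis projection frustum for a fixed display surface.
# The screen is centered at the origin in its own plane; the eye is
# given in screen coordinates with +z toward the viewer (meters).

def off_axis_frustum(eye, screen_w, screen_h, near):
    """Return (left, right, bottom, top) frustum extents at the near
    plane for an eye at (x, y, z) viewing a screen of size
    screen_w x screen_h centered at the origin."""
    ex, ey, ez = eye
    if ez <= 0:
        raise ValueError("eye must be in front of the screen (z > 0)")
    scale = near / ez  # project the screen edges onto the near plane
    left   = (-screen_w / 2 - ex) * scale
    right  = ( screen_w / 2 - ex) * scale
    bottom = (-screen_h / 2 - ey) * scale
    top    = ( screen_h / 2 - ey) * scale
    return left, right, bottom, top

# A centered eye yields a symmetric frustum; a calibrated eye that is
# shifted laterally and farther from the screen ("wider and deeper",
# as the study found) yields an asymmetric, slightly narrower one.
print(off_axis_frustum((0.0, 0.0, 0.6), 1.2, 0.9, 0.1))
print(off_axis_frustum((0.03, 0.0, 0.65), 1.2, 0.9, 0.1))
```

This is the same frustum construction used in head-tracked immersive displays generally; the paper's contribution is determining the eye position psychophysically rather than by physical measurement.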
BibTeX reference
@Article{PGRS13,
  author  = "Ponto, Kevin and Gleicher, Michael and Radwin, Robert and Shin, Hyun Joon",
  title   = "Perceptual Calibration for Immersive Display Environments",
  journal = "IEEE Transactions on Visualization and Computer Graphics",
  volume  = "19",
  number  = "4",
  pages   = "691-700",
  month   = "April",
  year    = "2013",
  pmcid   = "23428454",
  ee      = "http://ieeexplore.ieee.org/xpl/articleDetails.jsp?arnumber=6479210",
  doi     = "10.1109/TVCG.2013.36",
  url     = "http://graphics.cs.wisc.edu/Papers/2013/PGRS13"
}