DOI: 10.1145/1394281.1394283

The effects of virtual reality, augmented reality, and motion parallax on egocentric depth perception

Published: 09 August 2008

Abstract

As the use of virtual and augmented reality applications becomes more common, the need to fully understand how observers perceive spatial relationships grows more critical. One of the key requirements in engineering a practical virtual or augmented reality system is accurately conveying depth and layout. This requirement has frequently been assessed by measuring judgments of egocentric depth. These assessments have shown that observers in virtual reality (VR) perceive virtual space as compressed relative to the real world, resulting in systematic underestimations of egocentric depth. Previous work has indicated that similar effects may be present in augmented reality (AR) as well.
This paper reports an experiment that directly measured egocentric depth perception in both VR and AR conditions; it is believed to be the first experiment to directly compare these conditions in the same experimental framework. In addition to VR and AR, two control conditions were studied: viewing real-world objects, and viewing real-world objects through a head-mounted display. Finally, the presence and absence of motion parallax was crossed with all conditions. Like many previous studies, this one found that depth perception was underestimated in VR, although the magnitude of the effect was surprisingly low. The most interesting finding was that no underestimation was observed in AR.



Reviews

Felix Hamza-Lup

Depth perception is an interesting area of research in virtual environments, and with the advent of augmented reality (AR) applications it is being studied increasingly often. The complexity of an AR environment stems from the difficulty of the calibration procedure required to bring the virtual components into register (that is, into correspondence) with the real environment. Even without stereoscopic vision, humans can perceive depth from a myriad of depth cues interpreted by the brain, and research shows that these cues are important for accurately estimating the location of objects in both pure virtual reality (VR) and AR environments.

In this paper, the authors describe an experiment that measures human egocentric depth perception in VR and AR. A common technique for measuring an observer's judgment of egocentric depth is a visually directed walk. A see-through head-mounted display (HMD) serves as the three-dimensional (3D) visualization system, paired with a six-degrees-of-freedom tracking system. Every experimental condition is tested with and without motion parallax, to determine whether parallax provides an additional cue for more accurate depth estimation. Three major calibration procedures are also performed. The experimental setup consists of a hallway with a white pyramid as the target object, positioned on the ground plane at a distance of two to eight meters from the observer. Sixteen observers of different ages and genders participated in the study, each completing five practice trials before the experimental runs.

The results show that observers underestimated depth in VR, but to a lesser degree than reported in previous research, and that no depth underestimation occurred in the AR setup. Motion parallax had a significant effect only in the real+HMD condition (that is, observers viewing the real object through the HMD), where, contrary to the authors' expectations, it caused observers to underestimate depth.
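The compression effect discussed above is commonly quantified in this literature as the ratio of judged to actual distance, where a ratio near 1.0 indicates veridical perception and ratios below 1.0 indicate underestimation. The sketch below illustrates that metric; all distance values are invented for illustration and are not the paper's data.

```python
# Illustrative computation of the judged/actual distance ratio used to
# quantify egocentric depth compression. The numbers below are
# hypothetical, NOT taken from the experiment reported here.

actual = [2.0, 4.0, 6.0, 8.0]       # target distances in meters (2-8 m range)
judged_vr = [1.7, 3.3, 4.9, 6.6]    # hypothetical VR judgments (compressed)
judged_ar = [2.0, 4.1, 5.9, 8.0]    # hypothetical AR judgments (near-veridical)

def mean_ratio(judged, actual):
    """Mean judged/actual ratio; 1.0 = veridical, < 1.0 = underestimation."""
    return sum(j / a for j, a in zip(judged, actual)) / len(actual)

print(f"VR ratio: {mean_ratio(judged_vr, actual):.2f}")  # 0.83 -> compression
print(f"AR ratio: {mean_ratio(judged_ar, actual):.2f}")  # 1.00 -> veridical
```

With these hypothetical values, the VR condition shows roughly 17% compression while the AR condition is essentially veridical, mirroring the qualitative pattern of the paper's findings.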
Online Computing Reviews Service


Published In

APGV '08: Proceedings of the 5th symposium on Applied perception in graphics and visualization
August 2008
209 pages
ISBN:9781595939814
DOI:10.1145/1394281

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. augmented reality
  2. depth perception
  3. motion parallax
  4. virtual reality

Qualifiers

  • Research-article

Conference

APGV08

Acceptance Rates

Overall Acceptance Rate 19 of 33 submissions, 58%

Article Metrics

  • Downloads (last 12 months): 182
  • Downloads (last 6 weeks): 22
Reflects downloads up to 19 Nov 2024

Cited By

  • (2024) Decline in Sensory Integration in Old Age and Its Related Functional Brain Connectivity Correlates Observed during a Virtual Reality Task. Brain Sciences 14(8), 840. DOI: 10.3390/brainsci14080840. Online publication date: 21-Aug-2024.
  • (2024) Support Lines and Grids for Depth Ordering in Indoor Augmented Reality using Optical See-Through Head-Mounted Displays. Proceedings of the 2024 ACM Symposium on Spatial User Interaction, 1-11. DOI: 10.1145/3677386.3682097. Online publication date: 7-Oct-2024.
  • (2024) Invisible Mesh: Effects of X-Ray Vision Metaphors on Depth Perception in Optical-See-Through Augmented Reality. 2024 IEEE Conference Virtual Reality and 3D User Interfaces (VR), 376-386. DOI: 10.1109/VR58804.2024.00059. Online publication date: 16-Mar-2024.
  • (2024) Visual Perceptual Confidence: Exploring Discrepancies Between Self-reported and Actual Distance Perception In Virtual Reality. IEEE Transactions on Visualization and Computer Graphics 30(11), 7245-7254. DOI: 10.1109/TVCG.2024.3456165. Online publication date: 1-Nov-2024.
  • (2024) Assessing Depth Perception in VR and Video See-Through AR: A Comparison on Distance Judgment, Performance, and Preference. IEEE Transactions on Visualization and Computer Graphics 30(5), 2140-2150. DOI: 10.1109/TVCG.2024.3372061. Online publication date: 4-Mar-2024.
  • (2024) Human Robot Collaboration in Surgery: Communication Interface and Interaction Design. 2024 MIT Art, Design and Technology School of Computing International Conference (MITADTSoCiCon), 1-5. DOI: 10.1109/MITADTSoCiCon60330.2024.10575199. Online publication date: 25-Apr-2024.
  • (2024) Real and virtual environments have comparable spatial memory distortions after scale and geometric transformations. Spatial Cognition & Computation 24(2), 115-143. DOI: 10.1080/13875868.2024.2303016. Online publication date: 14-Feb-2024.
  • (2023) Limb loading enhances skill transfer between augmented and physical reality tasks during limb loss rehabilitation. Journal of NeuroEngineering and Rehabilitation 20(1). DOI: 10.1186/s12984-023-01136-5. Online publication date: 27-Jan-2023.
  • (2023) The Perceptual Science of Augmented Reality. Annual Review of Vision Science 9(1), 455-478. DOI: 10.1146/annurev-vision-111022-123758. Online publication date: 15-Sep-2023.
  • (2023) Effects of Vibrotactile Feedback on Aim-and-Throw Tasks in Virtual Environments. Proceedings of the 29th ACM Symposium on Virtual Reality Software and Technology, 1-2. DOI: 10.1145/3611659.3616894. Online publication date: 9-Oct-2023.
