
Depth judgment measures and occluders in near-field augmented reality

Published: 30 September 2009

Abstract

This poster describes a tabletop-based experiment that studied two complementary depth judgment protocols and the effect of an occluding surface on depth judgments in augmented reality (AR). The experimental setup (Figure 1) broadly replicated the setup described by Ellis and Menges [1998], and studied near-field distances between 30 and 60 centimeters. We collected data from six participants; we consider this to be a pilot study.
These distances are important for many AR applications that involve reaching and manipulating; examples include AR-assisted surgery and medical training devices, maintenance tasks, and tabletop meetings where participants jointly interact with and manipulate shared virtual objects in the middle of the table. Some of these tasks involve "x-ray vision", where AR users perceive objects that are located behind solid, opaque surfaces.
Ellis and Menges [1998] studied tabletop distances using a setup similar to Figure 1. They used a closed-loop perceptual matching task to examine near-field distances of 0.4 to 1.0 meters, and studied the effects of an occluding surface (the x-ray vision condition), convergence, accommodation, observer age, and monocular, biocular, and stereo AR displays. They found that monocular viewing degraded the depth judgment, and that the x-ray vision condition caused a change in vergence angle which resulted in depth judgments being biased towards the observer. They also found that cutting a hole in the occluding surface, which made the depth of the virtual object physically plausible, reduced the depth judgment bias.
The experimental setup (Figure 1) involved a height-adjustable tabletop that allowed observers to easily reach both above and below the table. We used two complementary dependent measures to assess depth judgments. First, we replicated the closed-loop matching task (Task = closed) of Ellis and Menges [1998]: observers manipulated a small light to match the depth of the bottom of a slowly rotating, upside-down pyramid (the target object). Second, we used an open-loop blind reaching task (Task = open), in order to compare the closed-loop task to a more perceptually-motivated depth judgment. Our occluding surface was a circular piece of foam-core covered with a highly salient checkerboard pattern; when observers saw the occluder (Occluder = present, otherwise Occluder = absent) it was presented 10 cm in front of the target. We used a factorial, within-subjects experimental design; observers made binocular stereo depth judgments.
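As a concrete illustration, the crossed conditions implied by this factorial, within-subjects design can be enumerated as in the following minimal sketch; the distance levels shown are assumptions for illustration only, since the poster states only the 30-60 cm range.

    from itertools import product

    # Sketch of the factorial, within-subjects design described above.
    # The distance levels listed here are illustrative assumptions; the poster
    # only states that near-field distances between 30 and 60 cm were studied.
    tasks = ["closed", "open"]          # closed-loop matching vs. open-loop blind reaching
    occluders = ["present", "absent"]   # checkerboard occluder 10 cm in front of the target
    distances_cm = [30, 40, 50, 60]     # assumed example levels within the stated range

    conditions = list(product(tasks, occluders, distances_cm))
    print(len(conditions), "conditions per observer")  # 16 with these example levels
    for task, occluder, distance in conditions:
        print(f"Task={task}, Occluder={occluder}, Distance={distance} cm")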
Figure 2 shows the results by task, occluder, and distance; the results are grouped by task for clarity, and should be judged relative to the 45° veridical lines. Figure 3 shows the results by task and occluder, expressed as normalized error = judged distance / veridical distance. Judgments in all conditions underestimated the veridical distance (a normalized error of 100%) to some degree. The closed-loop task replicated the finding of Ellis and Menges [1998]: the presence of the occluder biased the depth judgment towards the observer. The perceptually-based open-loop task resulted in greater underestimation; the larger error is unsurprising given that fewer depth cues are available in the open-loop task. Interestingly, in the open-loop condition observers judged the target to be farther when the occluder was present.
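As a concrete illustration of the normalized-error measure used in Figure 3, the following minimal sketch computes judged distance divided by veridical distance; the sample values are hypothetical and are not data from this study.

    # Normalized error = judged distance / veridical distance (1.0 = 100% = veridical).
    # The sample values below are hypothetical, not data from this study.
    def normalized_error(judged_cm: float, veridical_cm: float) -> float:
        """Return the depth judgment as a fraction of the veridical distance."""
        return judged_cm / veridical_cm

    # Example: a judgment of 52 cm against a 60 cm target underestimates distance.
    print(f"{normalized_error(52.0, 60.0):.1%}")  # 86.7%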
We consider this to be a pilot study; we plan to collect data from a larger number of participants and otherwise improve the experimental setup and design.

Reference

[1] Ellis, S. R., and Menges, B. M. 1998. Localization of virtual objects in the near visual field. Human Factors 40, 3, 415-431.

Cited By

  • (2023) Impact of motion cues, color, and luminance on depth perception in optical see-through AR displays. Frontiers in Virtual Reality, 4. DOI: 10.3389/frvir.2023.1243956. Online publication date: 6-Dec-2023.
  • (2019) Depth perception in shuffleboard: Depth cues effect on depth perception in virtual and augmented reality system. Journal of the Society for Information Display, 28(2), 164-176. DOI: 10.1002/jsid.840. Online publication date: 15-Sep-2019.

Published In

APGV '09: Proceedings of the 6th Symposium on Applied Perception in Graphics and Visualization
September 2009, 139 pages
ISBN: 9781605587431
DOI: 10.1145/1620993

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 30 September 2009
DOI: 10.1145/1620993.1621021



Conference

APGV '09: ACM Symposium on Applied Perception in Graphics and Visualization
September 30 - October 2, 2009
Chania, Crete, Greece

Acceptance Rates

Overall Acceptance Rate 19 of 33 submissions, 58%

