
Research article

MRTouch: Adding Touch Input to Head-Mounted Mixed Reality

Published: 01 April 2018

Abstract

We present MRTouch, a novel multitouch input solution for head-mounted mixed reality systems. Our system enables users to reach out and directly manipulate virtual interfaces affixed to surfaces in their environment, as though they were touchscreens. Touch input offers precise, tactile and comfortable user input, and naturally complements existing popular modalities such as voice and hand gesture. Our research prototype combines depth and infrared camera streams with real-time detection and tracking of surface planes to enable robust finger tracking even when both the hand and head are in motion. Our technique is implemented on a commercial Microsoft HoloLens without requiring any additional hardware or any user or environmental calibration. In a performance evaluation across 17 participants, 2 surface orientations and 4 surface materials, we demonstrate high input accuracy, with an average positional error of 5.4 mm and a 95% button size of 16 mm. Finally, we demonstrate the potential of our technique to enable on-world touch interactions through 5 example applications.
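The pipeline the abstract describes — track a surface plane, then fuse depth and infrared evidence to flag finger contacts relative to that plane — can be sketched in miniature. The following is an illustrative reconstruction only, not the paper's implementation: the function names `fit_plane` and `touch_candidates`, the 10 mm contact band, and the 0.5 infrared threshold are all hypothetical values chosen for the sketch.

```python
import numpy as np

def fit_plane(points):
    """Least-squares plane through an Nx3 point cloud.
    Returns (n, d) with n a unit normal such that n . p + d == 0
    for points p lying on the plane."""
    centroid = points.mean(axis=0)
    # The right singular vector with the smallest singular value is the
    # direction of least variance, i.e. the plane normal.
    _, _, vt = np.linalg.svd(points - centroid)
    n = vt[-1]
    return n, -float(n @ centroid)

def touch_candidates(points, ir, n, d, max_h=0.010, ir_thresh=0.5):
    """Flag 3D points hovering within max_h metres of the plane whose
    infrared response exceeds a skin-like threshold (both thresholds
    are illustrative, not the paper's)."""
    height = np.abs(points @ n + d)   # unsigned point-to-plane distance (m)
    return (height < max_h) & (ir > ir_thresh)
```

A real system would additionally segment fingertips out of the candidate mask and smooth contacts over time; the sketch shows only the per-point depth/infrared fusion step against a tracked plane.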




Published In

IEEE Transactions on Visualization and Computer Graphics, Volume 24, Issue 4, April 2018, 283 pages

Publisher

IEEE Educational Activities Department, United States



Cited By

  • AdapTUI: Adaptation of Geometric-Feature-Based Tangible User Interfaces in Augmented Reality. Proc. ACM Hum.-Comput. Interact. 8(ISS), 44–69, Oct 2024. doi:10.1145/3698127
  • MoiréTag: A Low-Cost Tag for High-Precision Tangible Interactions without Active Components. Proc. ACM Hum.-Comput. Interact. 8(ISS), 1–19, Oct 2024. doi:10.1145/3698113
  • SoundScroll: Robust Finger Slide Detection Using Friction Sound and Wrist-Worn Microphones. Proc. 2024 ACM Int. Symp. Wearable Computers (ISWC '24), 63–70, Oct 2024. doi:10.1145/3675095.3676614
  • VirtualNexus: Enhancing 360-Degree Video AR/VR Collaboration with Environment Cutouts and Virtual Replicas. Proc. 37th Ann. ACM Symp. User Interface Software and Technology (UIST '24), 1–12, Oct 2024. doi:10.1145/3654777.3676377
  • TouchInsight: Uncertainty-aware Rapid Touch and Text Input for Mixed Reality from Egocentric Vision. Proc. 37th Ann. ACM Symp. User Interface Software and Technology (UIST '24), 1–16, Oct 2024. doi:10.1145/3654777.3676330
  • Exploiting Physical Referent Features as Input for Multidimensional Data Selection in Augmented Reality. ACM Trans. Comput.-Hum. Interact. 31(4), 1–40, Sep 2024. doi:10.1145/3648613
  • Enhancing VR Sketching with a Dynamic Shape Display. Proc. 30th ACM Symp. Virtual Reality Software and Technology (VRST '24), 1–11, Oct 2024. doi:10.1145/3641825.3687714
  • Adaptive 3D UI Placement in Mixed Reality Using Deep Reinforcement Learning. Extended Abstracts of the CHI Conf. Human Factors in Computing Systems (CHI EA '24), 1–7, May 2024. doi:10.1145/3613905.3651059
  • PhoneInVR: An Evaluation of Spatial Anchoring and Interaction Techniques for Smartphone Usage in Virtual Reality. Proc. 2024 CHI Conf. Human Factors in Computing Systems (CHI '24), 1–16, May 2024. doi:10.1145/3613904.3642582
  • TriPad: Touch Input in AR on Ordinary Surfaces with Hand Tracking Only. Proc. 2024 CHI Conf. Human Factors in Computing Systems (CHI '24), 1–18, May 2024. doi:10.1145/3613904.3642323
