
DOI: 10.1145/2702123.2702214

User-Defined Game Input for Smart Glasses in Public Space

Published: 18 April 2015

Abstract

Smart glasses, such as Google Glass, provide always-available displays not offered by console and mobile gaming devices, and could potentially offer a pervasive gaming experience. However, research on game input for smart glasses has so far been constrained by the available sensors. To help inform design directions, this paper explores user-defined game input for smart glasses beyond the capabilities of current sensors, focusing on interaction in public settings. We conducted a user-defined input study with 24 participants, each performing 17 common game control tasks using 3 classes of interaction and 2 form factors of smart glasses, for a total of 2448 trials. Results show that users significantly preferred non-touch, non-handheld interaction, such as in-air gestures, over handheld input devices. For touch input without handheld devices, users preferred interacting with their palms over wearable devices (51% vs. 20%). In addition, out of concern for social acceptance, users preferred less noticeable interactions, favoring in-air gestures performed in front of the torso rather than in front of the face (63% vs. 37%).
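The study design follows the guessability/elicitation methodology common in HCI, where each participant proposes an input for every task (referent) and the researchers then measure consensus across proposals. As a minimal sketch, the widely used agreement score for one referent can be computed as below; the function name and the sample proposals are hypothetical illustrations, while the trial-count arithmetic mirrors the design stated in the abstract (24 participants × 17 tasks × 3 interaction classes × 2 form factors).

```python
from collections import Counter

def agreement_score(proposals):
    """Agreement score for one referent: sum over groups of
    identical proposals of (|group| / |all proposals|)^2."""
    counts = Counter(proposals)
    total = len(proposals)
    return sum((n / total) ** 2 for n in counts.values())

# Trial count implied by the study design in the abstract:
# 24 participants x 17 tasks x (3 interaction classes x 2 form factors).
participants, tasks, conditions = 24, 17, 3 * 2
assert participants * tasks * conditions == 2448

# Hypothetical proposals from four participants for one referent:
print(agreement_score(["swipe", "swipe", "tap", "swipe"]))  # (3/4)^2 + (1/4)^2 = 0.625
```

A score of 1.0 would mean every participant proposed the same input; scores near 1/n indicate no consensus among n distinct proposals.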





Published In

CHI '15: Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems
April 2015
4290 pages
ISBN:9781450331456
DOI:10.1145/2702123
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

Publisher

Association for Computing Machinery

New York, NY, United States



Author Tags

  1. control
  2. game
  3. guessability
  4. input
  5. pervasive gaming
  6. public space
  7. smart glasses
  8. user-defined
  9. wearable

Qualifiers

  • Research-article

Conference

CHI '15: CHI Conference on Human Factors in Computing Systems
April 18 - 23, 2015
Seoul, Republic of Korea

Acceptance Rates

CHI '15 Paper Acceptance Rate 486 of 2,120 submissions, 23%;
Overall Acceptance Rate 6,199 of 26,314 submissions, 24%


Article Metrics

  • Downloads (Last 12 months)99
  • Downloads (Last 6 weeks)12
Reflects downloads up to 22 Nov 2024

Cited By
  • (2024) Comparison of Unencumbered Interaction Technique for Head-Mounted Displays. Proceedings of the ACM on Human-Computer Interaction 8(ISS), 500-516. DOI: 10.1145/3698146. Online publication date: 24-Oct-2024.
  • (2024) GraV: Grasp Volume Data for the Design of One-Handed XR Interfaces. Proceedings of the 2024 ACM Designing Interactive Systems Conference, 151-167. DOI: 10.1145/3643834.3661567. Online publication date: 1-Jul-2024.
  • (2024) Exploring Mobile Devices as Haptic Interfaces for Mixed Reality. Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems, 1-17. DOI: 10.1145/3613904.3642176. Online publication date: 11-May-2024.
  • (2024) Elicitation and Evaluation of Hand-based Interaction Language for 3D Conceptual Design in Mixed Reality. International Journal of Human-Computer Studies 183. DOI: 10.1016/j.ijhcs.2023.103198. Online publication date: 14-Mar-2024.
  • (2024) A natural bare-hand interface-enabled interactive AR assembly guidance. The International Journal of Advanced Manufacturing Technology 133(7-8), 3193-3207. DOI: 10.1007/s00170-024-13922-z. Online publication date: 12-Jun-2024.
  • (2024) Big Movements or Small Motions: Controlling Digital Avatars with Single-Camera Motion Capture. Design, User Experience, and Usability, 130-148. DOI: 10.1007/978-3-031-61356-2_9. Online publication date: 29-Jun-2024.
  • (2023) Interpretative Structural Modeling Analyzes the Hierarchical Relationship between Mid-Air Gestures and Interaction Satisfaction. Applied Sciences 13(5), 3129. DOI: 10.3390/app13053129. Online publication date: 28-Feb-2023.
  • (2023) Brave New GES World: A Systematic Literature Review of Gestures and Referents in Gesture Elicitation Studies. ACM Computing Surveys 56(5), 1-55. DOI: 10.1145/3636458. Online publication date: 7-Dec-2023.
  • (2023) Surveying the Social Comfort of Body, Device, and Environment-Based Augmented Reality Interactions in Confined Passenger Spaces Using Mixed Reality Composite Videos. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 7(3), 1-25. DOI: 10.1145/3610923. Online publication date: 27-Sep-2023.
  • (2023) Exploring Gaze-assisted and Hand-based Region Selection in Augmented Reality. Proceedings of the ACM on Human-Computer Interaction 7(ETRA), 1-19. DOI: 10.1145/3591129. Online publication date: 18-May-2023.
