Move, Hold and Touch: A Framework for Tangible Gesture Interactive Systems
"> Figure 1
<p>The Tangible Gesture Interaction Framework (TGIF) syntax of tangible gestures: a gesture based on optional move, hold and touch components, related to one object.</p> "> Figure 2
<p>The communication model of tangible gesture interaction. A user performs a tangible gesture, which is a sign with an associated meaning. The computer (and possibly the other users) interprets this sign and acknowledges the user with feedback.</p> "> Figure 3
<p>Taxonomy of move, hold and touch combinations.</p> "> Figure 4
<p>Map of tangible gesture semantic constructs.</p> "> Figure 5
<p>TGI design process.</p> "> Figure 6
<p>The four gestures for the wearable WheelSense system.</p> "> Figure 7
<p>The four gestures for the embedded WheelSense system based on pressure sensors.</p> "> Figure 8
<p>Hybrid WheelSense System and the five gestures (top): fist squeeze, wrist extension, wrist flexion, hand push, index tap.</p> "> Figure 9
<p>User-elicited gesture taxonomy for the WheelSense system.</p> "> Figure 10
<p>The six gestures for the embedded WheelSense system based on capacitive sensors: hand tap in the top/bottom/right side/left side, and hand swipe up/down.</p> "> Figure 11
<p>The five gestures of the ADA lamp.</p> "> Figure 12
<p>The hug gesture for the Hugginess system.</p> "> Figure 13
<p>The 4 objects of the user evaluation (<b>left</b>), the fake prototype (<b>right</b>).</p> ">
Abstract
1. Introduction
2. Related Work
2.1. Tangible Interaction
2.2. Gestural Interaction
2.3. Gestures with Objects
2.4. Proposed Contribution
3. TGIF: Abstracting on Tangible Gestures
3.1. TGIF Syntax: Touch, Hold and Move
3.1.1. Gesture Delimitation in Time
3.1.2. Move, Hold and Touch: Single Gestures
3.1.3. Hold + Touch
3.1.4. Hold + Move
3.1.5. Touch + Move
3.1.6. Move + Hold + Touch
3.1.7. Extension to More than One Object and Full Body Interaction
3.2. TGIF Semantics: Meanings of Objects and Gestures
3.3. Classification of TGI Systems
| System | Application | Tech Approach | Object (Affordance) | Gesture | M | H | T |
| --- | --- | --- | --- | --- | --- | --- | --- |
| TabletopCars [56] | Gaming | E | Car toy | Move | x | | |
| Rosebud [37] | Storytelling | E | Doll or toy | Hold | | x | |
| Video bubbles [39] | Storytelling | EE | Big bubble (deformable) | Hold&Pressure (squeezing the bubble) | | x | |
| Reading Glove [57] | Storytelling | W | Any object | Hold | | x | |
| Pasquero et al.'s watch [38] | Control | EE | Watch | Touch; Touch + move | x | | x |
| Picture this! [24] | Storytelling | EE | Doll | Hold + move | x | x | |
| Gesture Sticker [44] | Control | EE | Any object | Hold + move | x | x | |
| Pen rolling [45] | Individual production | E | Pen | Hold + move (rolling the pen) | x | x | |
| SplashController [58] | Gaming | EE | Different types of water containers | Hold + move | x | x | |
| MoSo Tangibles [54] | Individual production | EE | Various artifacts | Hold + move | x | x | |
| Graspables [42] | Control/Gaming | EE | "Bar of soap" | Hold + touch | | x | x |
| Ashbrook et al.'s watch [46] | Control | EE | Watch | Touch + move | x | | x |
| TZee [48] | Collaborative production | E | TZee pyramid | Touch + move | x | | x |
| Spinning in control [59] | Control | EE | Remote controller (moving parts) | Hold + (Touch + move) | x | x | x |
| Hapticat [60] | Emotional design | EE | Cat (zoomorphic) | Hold + move; Touch + move; Hold&Pressure | x | x | x |
| MTPen [40] | Individual production | EE | Pen | Hold + touch; Touch + move | x | x | x |
| Tickle [61] | Control | W | Any handheld device | Hold + (Touch + move) | x | x | x |
| FoldMe [62] | Control | EE | Foldable display (deformable) | Hold + move; (Hold + move) + touch | x | x | x |
| Morganti et al.'s watch [63] | Control | W | Any graspable object | Hold; Hold + move | x | x | x |
| dSensingNI [64] | Control/Collaborative production | E | Many objects in the environment | Touch; Touch + move; Hold + move | x | x | x |
| WheelSense [49] | Control | EE | Steering wheel (ergonomics) | Hold + touch&Pressure; Hold + touch + move | x | x | x |
| PaperPhone [50] | Control | EE | Flexible phone (deformable) | Hold + touch + move (bending) | x | x | x |
4. Engineering TGI Systems
4.1. Designing Tangible Gesture Interaction
4.1.1. Common Practices for Popular Application Domains
4.1.2. Object Affordances
4.2. Building TGI Systems
5. Design of Four TGI Systems
5.1. WheelSense
5.2. ADA Lamp
5.3. Hugginess
5.4. Smart Watch
6. Discussion
6.1. Descriptive Power
6.2. Evaluative Power
- Integrated control: The emotional state of the ADA lamp is controlled by performing gestures on the surface of the lamp. The sensors that recognize these gestures are also integrated inside the lamp.
- Integrated representation: The lamp's emotional state is represented by its facial expressions, which are generated through RGB LEDs integrated in the lamp.
- Direct control: The lamp state cannot be controlled directly through tangible gestures. Although a deterministic state machine describes the lamp's behavior, we deliberately avoided direct reactions to the user's gestures. The almost unpredictable reactions are intended to create a life-like behavior that should foster long-lasting interactions.
- Direct representation: The representation of the lamp's emotional state is direct and is coded through specific facial expressions and colors.
- Meaningful control: The gestures used to control the lamp are those typically used to interact with humans. Therefore, the meaning and emotional valence of the gestures are consistent with the users' social habits.
- Meaningful representation: The representation of the lamp's emotional state through facial expressions is meaningful and can generally be understood easily by users. Obviously, the mapping between colors and emotions according to Plutchik's wheel of emotions is not universal, and the color mapping might not be meaningful for some users.
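The indirect coupling between gestures and reactions described above can be sketched as a state machine whose output depends on an accumulated internal mood rather than on the last gesture alone. This is a minimal illustration only; the class, gesture names, and mood mapping are hypothetical and do not reflect the actual ADA lamp implementation:

```python
import random

class AdaLampSketch:
    """Hypothetical sketch: gestures nudge an internal mood score, and the
    displayed emotion follows the mood, not the gesture itself."""

    EMOTIONS = ["sad", "calm", "happy", "excited"]  # ordered by valence
    # Assumed gesture-to-mood effects (illustrative, not from the paper).
    GESTURE_EFFECT = {"caress": +1, "slap": -2, "hold": +1, "tickle": +2}

    def __init__(self):
        self.mood = 0          # accumulated emotional state, clamped to [-3, 3]
        self.emotion = "calm"  # currently displayed facial expression

    def on_gesture(self, gesture):
        # The gesture updates the mood instead of selecting an emotion directly.
        self.mood += self.GESTURE_EFFECT.get(gesture, 0)
        self.mood = max(-3, min(3, self.mood))
        # Deterministic mapping from mood to emotion, with occasional
        # spontaneous shifts that make reactions feel less predictable.
        index = (self.mood + 3) * (len(self.EMOTIONS) - 1) // 6
        if random.random() < 0.1:
            index = random.randrange(len(self.EMOTIONS))
        self.emotion = self.EMOTIONS[index]
        return self.emotion
```

With this structure, two identical gestures can produce different expressions depending on the mood accumulated from the interaction history, which is the life-like, non-direct behavior discussed above.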
6.3. Generative Power
| System | Application | Tech Approach | Gesture Design Approach | Object (Affordance) | Gesture | M | H | T |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| WheelSense V1 | Control | W | Hold constraint (technology-driven) | Steering wheel | Hold + touch&Pressure; Hold + touch + move | x | x | x |
| WheelSense V2 | Control | EE | Hold constraint (technology-driven) | Steering wheel | Hold + touch&Pressure; Hold + touch + move | x | x | x |
| WheelSense V3 | Control | W + EE | Hold constraint (technology-driven) | Steering wheel | Hold + touch&Pressure; Hold + touch + move | x | x | x |
| WheelSense V4 | Control | EE | User-driven (gesture elicitation) | Steering wheel | Touch; Touch + move | x | | x |
| ADA Lamp | Affective communication | EE | Common practices + object affordance | Lamp (human head) | Touch; Touch + move; Hold + touch | x | x | x |
| Hugginess | Affective communication | EE/W | Common practices + object affordance | Human body | Hold + touch&Pressure | | x | x |
| Smart Watch | Control | W | Technology-driven + user observation study | Different small objects (can be held in the hand) | Hold + move | x | x | |
7. Limitations
8. Conclusions
Acknowledgments
Author Contributions
Conflicts of Interest
References
- Weiser, M. The computer for the 21st century. Sci. Am. 1991, 265, 94–104. [Google Scholar] [CrossRef]
- Jacob, R.J.K.; Girouard, A.; Hirshfield, L.M.; Horn, M.S.; Shaer, O.; Solovey, E.T.; Zigelbaum, J. Reality-based interaction: A framework for post-wimp interfaces. In Proceedings of the CHI‘08, Florence, Italy, 5–10 April 2008; pp. 201–210.
- Bolt, R.A. “Put-that-there”: Voice and gesture at the graphics interface. In SIGGRAPH ‘80: Proceedings of the 7th annual conference on Computer graphics and interactive techniques, Seattle, Washington, USA, 14–18 July 1980; pp. 262–270.
- Feldman, A.; Tapia, E.M.; Sadi, S.; Maes, P.; Schmandt, C. Reachmedia: On-the-move interaction with everyday objects. In Proceedings of the Ninth IEEE International Symposium on Wearable Computers (ISWC‘05), Osaka, Japan, 18–21 October 2005; pp. 52–59.
- Fitzmaurice, G. Graspable User Interfaces. Ph.D. Thesis, University of Toronto, Toronto, ON, Canada, 1996. Available online: http://www.dgp.toronto.edu/~gf/papers/Thesis.gf.final.pdf (accessed on 15 June 2015).
- Fishkin, K. A taxonomy for and analysis of tangible interfaces. Pers. Ubiquitous Comput. 2004, 8, 347–358. [Google Scholar] [CrossRef]
- Ishii, H.; Lakatos, D.; Bonanni, L.; Labrune, J.-B.J. Radical atoms: Beyond tangible bits, toward transformable materials. Interactions 2012, 19, 38–51. [Google Scholar] [CrossRef]
- Shaer, O.; Leland, N.; Calvillo-Gamez, E.; Jacob, R.K. The tac paradigm: Specifying tangible user interfaces. Pers. Ubiquitous Comput. 2004, 8, 359–369. [Google Scholar] [CrossRef]
- Ullmer, B.; Ishii, H. Emerging frameworks for tangible user interfaces. IBM Syst. J. 2000, 39, 915–931. [Google Scholar] [CrossRef]
- van den Hoven, E.; Mazalek, A. Grasping gestures: Gesturing with physical artifacts. Artif. Intell. Eng. Des. Anal. Manuf. 2011, 25, 255–271. [Google Scholar] [CrossRef]
- Mazalek, A.; van den Hoven, E. Framing tangible interaction frameworks. Artif. Intell. Eng. Des. Anal. Manuf. 2009, 23, 225–235. [Google Scholar] [CrossRef]
- Wellner, P.; Mackay, W.; Gold, R. Computer-augmented environments: Back to the real world. Commun. ACM 1993, 36, 24–26. [Google Scholar] [CrossRef]
- Hornecker, E.; Buur, J. Getting a grip on tangible interaction: A framework on physical space and social interaction. In Proceedings of the SIGCHI conference on Human Factors in computing systems, Montréal, Québec, Canada, 22–27 April 2006; pp. 437–446.
- Van Den Hoven, E.; Van De Garde-Perik, E.; Offermans, S.; Van Boerdonk, K.; Lenssen, K.M.H. Moving tangible interaction systems to the next level. Computer 2013, 46, 70–76. [Google Scholar] [CrossRef]
- Djajadiningrat, T.; Matthews, B.; Stienstra, M. Easy doesn’t do it: Skill and expression in tangible aesthetics. Pers. Ubiquitous Comput. 2007, 11, 657–676. [Google Scholar] [CrossRef] [Green Version]
- Wensveen, S.A.G.; Djajadiningrat, J.P.; Overbeeke, C.J. Interaction frogger: A design framework to couple action and function through feedback and feedforward. In DIS ‘04 Proceedings of the 5th conference on Designing interactive systems: processes, practices, methods, and techniques, Cambridge, Massachusetts, USA, 1–4 August 2004; pp. 177–184.
- Gentilucci, M.; Corballis, M.C. From manual gesture to speech: A gradual transition. Neurosci. Biobehav. Rev. 2006, 30, 949–960. [Google Scholar] [CrossRef] [PubMed]
- Quek, F.; Mcneill, D.; Bryll, R.; Duncan, S.; Ma, X.-F.; Kirbas, C.; Mccullough, K.E.; Ansari, R. Multimodal human discourse: Gesture and speech. ACM Transac. Comput.-Hum. Interact. 2002, 9, 171–193. [Google Scholar] [CrossRef]
- Quek, F.K. Eyes in the interface. Image Vis. Comput. 1995, 13, 511–525. [Google Scholar] [CrossRef]
- Karam, M. A Framework for Research and Design of Gesture-Based Human Computer Interactions. Ph.D. Thesis, University of Southampton, Southampton, UK, 2006. [Google Scholar]
- Baudel, T.; Beaudouin-Lafon, M. Charade: Remote control of objects using free-hand gestures. Commun. ACM 1993, 36, 28–35. [Google Scholar] [CrossRef]
- Kammer, D.; Wojdziak, J.; Keck, M.; Groh, R.; Taranko, S. Towards a formalization of multi-touch gestures. In ACM International Conference on Interactive Tabletops and Surfaces—ITS ’10, Saarbrücken, Germany, 7–10 November 2010; pp. 49–58.
- Golod, I.; Heidrich, F.; Möllering, C.; Ziefle, M. Design principles of hand gesture interfaces for microinteractions. In Proceedings of the 6th International Conference on Designing Pleasurable Products and Interfaces—DPPI ‘13, Newcastle upon Tyne, UK, 3–5 September 2013; pp. 11–20.
- Vaucelle, C.; Ishii, H. Picture this! Film assembly using toy gestures. In Proceedings of the 10th international conference on Ubiquitous computing, Seoul, Korea, 21–24 September 2008; Volume 8, pp. 350–359.
- Wimmer, R. Grasp sensing for human-computer interaction. In Proceedings of the fifth international conference on Tangible, embedded, and embodied interaction, Funchal, Portugal, 22–26 January 2011; pp. 221–228.
- Wolf, K.; Naumann, A.; Rohs, M.; Müller, J. A taxonomy of microinteractions: Defining microgestures based on ergonomic and scenario-dependent requirements. In Proceedings of the INTERACT 2011, Lisbon, Portugal, 5–9 September 2011; pp. 559–575.
- Valdes, C.; Eastman, D.; Grote, C.; Thatte, S.; Shaer, O.; Mazalek, A.; Ullmer, B.; Konkel, M.K. Exploring the design space of gestural interaction with active tokens through user-defined gestures. In Proceedings of the 32nd annual ACM conference on Human factors in computing systems, Toronto, ON, Canada, 26 April–1 May 2014; pp. 4107–4116.
- De Saussure, F. Course in general linguistics; Columbia University Press: New York, NY, USA, 2011. [Google Scholar]
- Bühler, K. Sprachtheorie; Fischer: Oxford, UK, 1934. [Google Scholar]
- DeSouza, C.S. The Semiotic Engineering of Human-Computer Interaction; The MIT Press: Cambridge, MA, USA, 2005; p. 312. [Google Scholar]
- Matthews, B. Grammar, meaning and movement-based interaction. In Proceedings of the OZCHI ‘06, Sydney, Australia, 20–24 November 2006; pp. 405–408.
- Price, S.; Rogers, Y. Let’s get physical: The learning benefits of interacting in digitally augmented physical spaces. Comput. Educ. 2004, 43, 137–151. [Google Scholar] [CrossRef]
- Fishkin, K.P.; Gujar, A.; Mochon, C.; Want, R. Squeeze me, hold me, tilt me ! An exploration of manipulative user interfaces. In Proceedings of the CHI ‘98, Los Angeles, CA, USA, 18–23 April 1998; pp. 17–24.
- Wimmer, R.; Boring, S. Handsense: Discriminating different ways of grasping and holding a tangible user interface. In Proceedings of the 3rd International Conference on Tangible and Embedded Interaction, Cambridge, UK, 16–18 February 2009; pp. 359–362.
- Wobbrock, J.O.; Morris, M.R.; Wilson, A.D. User-defined gestures for surface computing. In Proceedings of the 27th international conference on Human factors in computing systems—CHI 09, Boston, MA, USA, 4–9 April 2009; pp. 1083–1092.
- Hale, K.S.; Stanney, K.M. Deriving haptic design guidelines from human physiological, psychophysical, and neurological foundations. IEEE Comput. Graph. Appl. 2004, 24, 33–39. [Google Scholar] [CrossRef]
- Glos, J.W.; Cassell, J. Rosebud: Technological toys for storytelling. In Proceedings of the CHI’97 extended abstracts on Human factors in computing systems: Looking to the future, Atlanta, GA, USA, 22–27 March 1997; pp. 359–360.
- Pasquero, J.; Stobbe, S.J.; Stonehouse, N. A haptic wristwatch for eyes-free interactions. In Proceedings of the 2011 annual conference on Human factors in computing systems, Vancouver, BC, Canada, 7–12 May 2011; pp. 3257–3266.
- Ryokai, K.; Raffle, H.; Horii, H.; Mann, Y. Tangible video bubbles. In Proceedings of the 28th of the international conference extended abstracts on Human factors in computing systems, Atlanta, GA, USA, 10–15 April 2010; pp. 2775–2784.
- Song, H.; Benko, H.; Guimbretiere, F.; Izadi, S.; Cao, X.; Hinckley, K. Grips and gestures on a multi-touch pen. In Proceedings of the 2011 annual conference on Human factors in computing systems, Vancouver, BC, Canada, 7–12 May 2011; pp. 1323–1332.
- Feix, T. Human grasping database. Available online: http://grasp.xief.net/ (accessed on 15 June 2015).
- Taylor, B.; Bove, V.M. Graspables: Grasp-recognition as a user interface. In Proceedings of the CHI’09, Boston, MA, USA, 4–9 April 2009; pp. 917–925.
- Ferscha, A.; Resmerita, S.; Holzmann, C.; Reicho, M. Orientation sensing for gesture-based interaction with smart artifacts. Comput. Commun. 2005, 28, 1552–1563. [Google Scholar] [CrossRef]
- Atia, A.; Takahashi, S.; Tanaka, J. Smart gesture sticker: Smart hand gestures profiles for daily objects interaction. In Proceedings of 2010 IEEE/ACIS 9th International Conference on Computer and Information Science, Yamagata, Japan, 18–20 August 2010; pp. 482–487.
- Bi, X.; Moscovich, T.; Ramos, G.; Balakrishnan, R.; Hinckley, K. An exploration of pen rolling for pen-based interaction. In Proceedings of the 21st annual ACM symposium on User interface software and technology—UIST ‘08, Monterey, CA, USA, 19–22 October 2008; pp. 191–200.
- Ashbrook, D.; Lyons, K.; Starner, T. An investigation into round touchscreen wristwatch interaction. In Proceedings of the 10th international conference on Human computer interaction with mobile devices and services—MobileHCI ‘08, Amsterdam, The Netherlands, 2–5 September 2008; pp. 311–314.
- Perrault, S.; Lecolinet, E.; Eagan, J.; Guiard, Y. Watchit: Simple gestures and eyes-free interaction for wristwatches and bracelets. In Proceedings of the CHI‘13, Paris, France, 27 April–2 May 2013; pp. 1–10.
- Williams, C.; Yang, X.D.; Partridge, G.; Millar-Usiskin, J.; Major, A.; Irani, P. Tzee: Exploiting the lighting properties of multi-touch tabletops for tangible 3d interactions. In Proceedings of the 2011 annual conference on Human factors in computing systems—CHI ‘11, Vancouver, BC, Canada, 7–12 May 2011; pp. 1363–1372.
- Angelini, L.; Caon, M.; Carrino, F.; Carrino, S.; Lalanne, D.; Abou Khaled, O.; Mugellini, E. Wheelsense: Enabling tangible gestures on the steering wheel for in-car natural interaction. In Proceedings of the HCII‘13, Las Vegas, NV, USA, 21–26 July 2013; pp. 531–540.
- Lahey, B.; Girouard, A.; Burleson, W.; Vertegaal, R. Paperphone: Understanding the use of bend gestures in mobile devices with flexible electronic paper displays. In Proceedings of the CHI, Vancouver, BC, Canada, 7–12 May 2011; pp. 1303–1312.
- Dourish, P. Where the Action is; MIT Press: Cambridge, MA, USA, 2004; p. 233. [Google Scholar]
- Hornecker, E. Beyond affordance: Tangibles’ hybrid nature. In Proceedings of the Sixth International Conference on Tangible, Embedded and Embodied Interaction, Kingston, ON, Canada, 19–22 February 2012; pp. 175–182.
- Koleva, B.; Benford, S.; Hui Ng, K.; Rodden, T. A framework for tangible user interfaces. In Proceedings of the Physical Interaction (PI03)—Workshop on RealWorld User Interfaces, Udine, Italy, 8 September 2003; pp. 46–50.
- Bakker, S.; Antle, A.N.; van den Hoven, E. Embodied metaphors in tangible interaction design. Pers. Ubiquitous Comput. 2011, 16, 433–449. [Google Scholar] [CrossRef]
- van den Hoven, E.; Eggen, B. Tangible computing in everyday life: Extending current frameworks for tangible user interfaces with personal objects. Ambient Intell. 2004, 3295, 230–242. [Google Scholar]
- Dang, C.T.; André, E. Tabletopcars-interaction with active tangible remote controlled cars. In Proceedings of the TEI‘13, Barcelona, Spain, 10–13 February 2013; pp. 33–40.
- Tanenbaum, J.; Tanenbaum, K.; Antle, A. The reading glove: Designing interactions for object-based tangible storytelling. In Proceedings of the AH‘10, Megève, France, 2–3 April 2010; pp. 1–9.
- Geurts, L.; Abeele, V.V. Splash controllers: Game controllers involving the uncareful manipulation of water. In Proceedings of the TEI‘12, Kingston, ON, Canada, 19–22 February 2012; Volume 1, pp. 183–186.
- Kimman, F.; Weda, H.; van den Hoven, E.; de Zeeuw, T.; Luitjens, S. Spinning in control: Design exploration of a cross-device remote. In Proceedings of the TEI’11, Funchal, Portugal, 22–26 January 2011; pp. 189–192.
- Yohanan, S.; Chan, M.; Hopkins, J.; Sun, H.; Maclean, K. Hapticat: Exploration of affective touch. In Proceedings of the ICMI‘05, Trento, Italy, 4–6 October 2005; pp. 222–229.
- Wolf, K.; Schleicher, R.; Kratz, S.; Rohs, M. Tickle: A surface-independent interaction technique for grasp interfaces. In Proceedings of the TEI‘13, Barcelona, Spain, 10–13 February 2013; pp. 185–192.
- Lissermann, R. Foldme: Interacting with double-sided foldable displays. In Proceedings of the TEI‘12, Kingston, ON, Canada, 19–22 February 2012; Volume 1, pp. 33–40.
- Morganti, E.; Angelini, L.; Adami, A.; Lalanne, D.; Lorenzelli, L.; Mugellini, E. A smart watch with embedded sensors to recognize objects, grasps and forearm gestures. In Proceedings of the International Symposium on Robotics and Intelligent Sensors 2012, Kuching, Malaysia, 4–6 September 2012; Procedia Eng. 2012, pp. 1169–1175.
- Klompmaker, F.; Nebe, K.; Fast, A. Dsensingni: A framework for advanced tangible interaction using a depth camera. In Proceedings of the Sixth International Conference on Tangible, Embedded and Embodied Interaction, Kingston, ON, Canada, 19–22 February 2012; pp. 217–224.
- Morris, M.R.; Danielescu, A.; Drucker, S.; Fisher, D.; Lee, B.; schraefel, M.C.; Wobbrock, J.O. Reducing legacy bias in gesture elicitation studies. Interactions 2014, 21, 40–45. [Google Scholar] [CrossRef]
- Keng, J.; Teh, S.; Cheok, A.D. Huggy pajama: A mobile parent and child hugging communication system. In Proceedings of the IDC‘08, Chicago, IL, USA, 11–13 June 2008; pp. 250–257.
- Jordà, S.; Geiger, G.; Alonso, M.; Kaltenbrunner, M. The reactable: Exploring the synergy between live music performance and tabletop tangible interfaces. In TEI ‘07 Proceedings of the 1st International Conference on Tangible and Embedded Interaction, Baton Rouge, LA, USA, 15–17 February 2007; pp. 139–146.
- Radford, L. Why do gestures matter? Sensuous cognition and the palpability of mathematical meanings. Educ. Stud. Math. 2008, 70, 111–126. [Google Scholar] [CrossRef]
- Schmitz, M. Concepts for life-like interactive objects. In Proceedings of the fifth international conference on Tangible, embedded, and embodied interaction, Funchal, Portugal, 22–26 January 2011; pp. 157–164.
- Lopes, P.; Jonell, P.; Baudisch, P. Affordance++: Allowing objects to communicate dynamic use. In Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems, Seoul, Korea, 18–23 April 2015; pp. 2515–2524.
- Sato, M.; Poupyrev, I.; Harrison, C. Touché: Enhancing touch interaction on humans, screens, liquids, and everyday objects. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI ‘12), Austin, TX, USA, 5–10 May 2012; pp. 483–492.
- Wimmer, R. Flyeye: Grasp-sensitive surfaces using optical fiber. In Proceedings of the fourth international conference on Tangible, embedded, and embodied interaction—TEI ‘10, Cambridge, MA, USA, 24–27 January 2010; pp. 245–248.
- Ketabdar, H.; Moghadam, P.; Roshandel, M. Pingu: A new miniature wearable device for ubiquitous computing environments. In Proceedings of 6th International Conference on Complex, Intelligent, and Software Intensive Systems, CISIS 2012, Palermo, Italy, 4–6 July 2012; pp. 502–506.
- Li, Z.; Wachsmuth, S.; Fritsch, J.; Sagerer, G. View-adaptive manipulative action recognition for robot companions. In Proceedings of 2007 IEEE/RSJ International Conference on Intelligent Robots and Systems, San Diego, CA, USA, 29 October–2 November 2007; pp. 1028–1033.
- Angelini, L.; Baumgartner, J.; Carrino, F.; Carrino, S.; Khaled, O.A.; Sauer, J.; Lalanne, D.; Sonderegger, A.; Mugellini, E. Gesturing on the steering wheel, a comparison with speech and touch interaction modalities. Technical Report 15-03. Department of Informatics, University of Fribourg: Fribourg, Switzerland, 2015. [Google Scholar]
- Angelini, L.; Caon, M.; Lalanne, D.; Abou khaled, O.; Mugellini, E. Towards an anthropomorphic lamp for affective interaction. In Proceedings of the Ninth International Conference on Tangible, Embedded, and Embodied Interaction—TEI ‘14, Stanford, CA, USA, 16–19 January 2015; pp. 661–666.
- Angelini, L.; Carrino, F.; Carrino, S.; Caon, M.; Khaled, O.A.; Baumgartner, J.; Sonderegger, A.; Lalanne, D.; Mugellini, E. Gesturing on the steering wheel: A user-elicited taxonomy. In Proceedings of the 6th International Conference on Automotive User Interfaces and Interactive Vehicular Applications—AutomotiveUI ‘14, Seattle, WA, USA, 17–19 September 2014; pp. 1–8.
- Angelini, L.; Carrino, F.; Carrino, S.; Caon, M.; Lalanne, D.; Khaled, O.A.; Mugellini, E. Opportunistic synergy: A classifier fusion engine for micro-gesture recognition. In Proceedings of the 5th International Conference on Automotive User Interfaces and Interactive Vehicular Applications - AutomotiveUI ‘13, Eindhoven, Netherlands, 28–30 October 2013; pp. 30–37.
- Angelini, L.; Khaled, O.A.; Caon, M.; Mugellini, E.; Lalanne, D. Hugginess: Encouraging interpersonal touch through smart clothes. In Proceedings of the 2014 ACM International Symposium on Wearable Computers Adjunct Program—ISWC ‘14 Adjunct, Seattle, WA, USA, 13–17 September 2014; pp. 155–162.
- Carrino, F.; Carrino, S.; Caon, M.; Angelini, L.; Khaled, O.A.; Mugellini, E. In-Vehicle natural interaction based on electromyography. In Proceedings of the AutomotiveUI‘12 Adjunct, Portsmouth, NH, USA, 17–19 October 2012; pp. 23–24.
- Manuel D’enseignement Pour La Formation Et L’examen Des Moniteurs De Conduite. Available online: http://www.asa.ch/media/archive1/Shop/Ausbildungsunterlagen/gratis/Leitfaden_Fahrlehrer_Fachgruppe_7_f.pdf (accessed on 15 June 2015).
- Mugellini, E.; Rubegni, E.; Gerardi, S.; Khaled, O.A. Using personal objects as tangible interfaces for memory recollection and sharing. In Proceedings of the 1st international conference on Tangible and embedded interaction, Baton Rouge, LA, USA, 15–17 February 2007; pp. 231–238.
- Beaudouin-Lafon, M. Designing interaction, not interfaces. In Proceedings of the working conference on Advanced visual interfaces, Gallipoli, Italy, 25–28 May 2004; pp. 15–22.
- Beaudouin-Lafon, M.; Mackay, W.E. Reification, polymorphism and reuse: Three principles for designing visual interfaces. In Proceedings of the working conference on Advanced visual interfaces, Palermo, Italy, 24–26 May 2000; pp. 102–109.
- Carrino, S.; Caon, M.; Khaled, O.A.; Ingold, R.; Mugellini, E. Functional gestures for human-environment interaction. In Human-Computer Interaction. Interaction Modalities and Techniques; Springer: Las Vegas, NV, USA, 2013; pp. 167–176. [Google Scholar]
© 2015 by the authors; licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution license (http://creativecommons.org/licenses/by/4.0/).
Share and Cite
Angelini, L.; Lalanne, D.; Hoven, E.V.d.; Khaled, O.A.; Mugellini, E. Move, Hold and Touch: A Framework for Tangible Gesture Interactive Systems. Machines 2015, 3, 173-207. https://doi.org/10.3390/machines3030173