
DOI: 10.1145/3274247.3274511

PuppetPhone: puppeteering virtual characters using a smartphone

Published: 08 November 2018

Abstract

Video games enable the representation and control of characters that move agilely through virtual environments. However, the detached character interaction they offer, often based on a push-button metaphor, is far from the satisfying feeling of grasping and moving physical toys. In this paper, we propose a new interaction metaphor that reduces the gap between physical toys and virtual characters. The user moves a smartphone around, and a puppet that responds in real time to these manipulations is seen through the screen. The virtual character moves to follow the user's gestures, as if it were attached to the phone by a rigid stick. This yields a natural interaction, similar to moving a physical toy, and the puppet feels alive because its movements are augmented with compelling animations. With the smartphone, our method ties the control of the character and of the camera together into a single interaction mechanism. We validate our system with an application in Augmented Reality.
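The metaphor of a puppet rigidly attached to the phone, with camera and character sharing a single control, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the stick offset, the smoothing factor, and the pose inputs below are assumptions made only for the example.

    # Minimal sketch (illustrative, not the paper's code): the puppet's target position
    # is derived from the tracked phone pose as if it hung from a rigid stick, then
    # low-pass filtered so a character controller can animate toward it each frame.
    import numpy as np

    STICK = np.array([0.0, -0.3, 0.5])   # assumed rigid offset from phone to puppet (metres)
    SMOOTH = 0.2                          # assumed smoothing factor for natural-looking motion

    def puppet_target(phone_pos, phone_rot, prev_target):
        """phone_pos: (3,) world position; phone_rot: (3,3) rotation from AR tracking."""
        raw = phone_pos + phone_rot @ STICK                # rigid-stick attachment point
        return prev_target + SMOOTH * (raw - prev_target)  # ease toward it every frame

In a real AR application, the phone pose would come from the device's tracking (e.g. ARKit or ARCore), and the character controller would play locomotion animations toward the smoothed target; easing toward the target rather than snapping to it is what makes the puppet appear to follow the gesture instead of teleporting with the phone.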

Supplementary Material

MP4 File (a5-anderegg.mp4)



Information

Published In

MIG '18: Proceedings of the 11th ACM SIGGRAPH Conference on Motion, Interaction and Games
November 2018
185 pages
ISBN:9781450360159
DOI:10.1145/3274247
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].


Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 08 November 2018


Badges

  • Best Short Paper

Author Tags

  1. character control
  2. interaction
  3. puppet
  4. smartphone

Qualifiers

  • Short-paper

Funding Sources

  • European Union

Conference

MIG '18: Motion, Interaction and Games
November 8-10, 2018
Limassol, Cyprus



Bibliometrics & Citations

Article Metrics

  • Downloads (last 12 months): 38
  • Downloads (last 6 weeks): 3

Reflects downloads up to 21 Nov 2024

Cited By

  • (2024) Controller influence on self-determination versus performance in a mobile augmented reality platform game. Proceedings of the 17th ACM SIGGRAPH Conference on Motion, Interaction, and Games, 1-6. DOI: 10.1145/3677388.3696332. Online publication date: 21-Nov-2024.
  • (2024) Dragon's Path: Synthesizing User-Centered Flying Creature Animation Paths for Outdoor Augmented Reality Experiences. ACM SIGGRAPH 2024 Conference Papers, 1-11. DOI: 10.1145/3641519.3657397. Online publication date: 13-Jul-2024.
  • (2024) FingerPuppet: Finger-Walking Performance-based Puppetry for Human Avatar. Extended Abstracts of the CHI Conference on Human Factors in Computing Systems, 1-6. DOI: 10.1145/3613905.3650840. Online publication date: 11-May-2024.
  • (2024) Interactive Impulses in VR: A Modular Approach to Combine Stop-Motion Animation and Experimental Puppeteering in a Narrative Virtual Environment. 2024 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW), 90-96. DOI: 10.1109/VRW62533.2024.00022. Online publication date: 16-Mar-2024.
  • (2023) Double Doodles: Sketching Animation in Immersive Environment With 3+6 DOFs Motion Gestures. Proceedings of the 31st ACM International Conference on Multimedia, 6998-7006. DOI: 10.1145/3581783.3613783. Online publication date: 26-Oct-2023.
  • (2023) Location-Aware Adaptation of Augmented Reality Narratives. Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, 1-15. DOI: 10.1145/3544548.3580978. Online publication date: 19-Apr-2023.
  • (2023) Transfer4D: A Framework for Frugal Motion Capture and Deformation Transfer. 2023 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 12836-12846. DOI: 10.1109/CVPR52729.2023.01234. Online publication date: Jun-2023.
  • (2022) Understanding and Creating Spatial Interactions with Distant Displays Enabled by Unmodified Off-The-Shelf Smartphones. Multimodal Technologies and Interaction 6, 10 (2022), 94. DOI: 10.3390/mti6100094. Online publication date: 19-Oct-2022.
  • (2022) VCPoser: Interactive Pose Generation of Virtual Characters Corresponding to Human Pose Input. Proceedings of the 28th ACM Symposium on Virtual Reality Software and Technology, 1-10. DOI: 10.1145/3562939.3565640. Online publication date: 29-Nov-2022.
  • (2022) Interactive augmented reality storytelling guided by scene semantics. ACM Transactions on Graphics 41, 4 (2022), 1-15. DOI: 10.1145/3528223.3530061. Online publication date: 22-Jul-2022.
