Sketched Reality: Sketching Bi-Directional Interactions Between Virtual and Physical Worlds with AR and Actuated Tangible UI

Published: 28 October 2022 · DOI: 10.1145/3526113.3545626

Abstract

This paper introduces Sketched Reality, an approach that combines AR sketching and actuated tangible user interfaces (TUIs) for bi-directional sketching interaction. Bi-directional sketching enables virtual sketches and physical objects to “affect” each other through physical actuation and digital computation. In existing AR sketching, the relationship between the virtual and physical worlds is one-directional: physical interaction can affect virtual sketches, but virtual sketches have no return effect on physical objects or the environment. In contrast, bi-directional sketching allows seamless coupling between sketches and actuated TUIs. In this paper, we employ small tabletop robots (Sony Toio) and an iPad-based AR sketching tool to demonstrate the concept. In our system, virtual sketches drawn and simulated on an iPad (e.g., lines, walls, pendulums, and springs) can move, actuate, collide with, and constrain physical Toio robots, as if the virtual sketches and physical objects existed in the same space, through seamless coupling between AR and robot motion. This paper contributes a set of novel interactions and a design space of bi-directional AR sketching. We demonstrate a series of potential applications, such as tangible physics education, explorable mechanisms, tangible gaming for children, and in-situ robot programming via sketching.
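To make the virtual-to-physical direction of the coupling concrete, the sketch below illustrates, in plain Python, one possible form of the core loop: a user-drawn line segment is treated as a rigid wall, and whenever the tracked robot penetrates it, the system commands the robot to back out along the wall normal. This is a minimal illustration under stated assumptions, not the authors' implementation; the ToioRobot class and its drive() method are hypothetical stand-ins for the real tracking and motor-command interfaces.

```python
# Minimal sketch (not the paper's code) of a sketched virtual wall constraining
# a physical robot: if the robot's tracked position overlaps the wall, push it
# back along the wall normal. ToioRobot and drive() are hypothetical stand-ins.

from dataclasses import dataclass
import math


@dataclass
class ToioRobot:
    """Hypothetical stand-in for a tabletop robot tracked in sketch coordinates."""
    x: float
    y: float
    radius: float = 1.0  # collision radius of the robot body, in sketch units

    def drive(self, dx: float, dy: float) -> None:
        # A real system would send a motor command here; we just update the
        # simulated position so the example stays self-contained and runnable.
        self.x += dx
        self.y += dy


@dataclass
class SketchedWall:
    """A straight line segment drawn by the user, acting as a rigid wall."""
    x1: float
    y1: float
    x2: float
    y2: float

    def closest_point(self, px: float, py: float) -> tuple[float, float]:
        # Project (px, py) onto the segment and clamp to its endpoints.
        vx, vy = self.x2 - self.x1, self.y2 - self.y1
        wx, wy = px - self.x1, py - self.y1
        t = max(0.0, min(1.0, (wx * vx + wy * vy) / (vx * vx + vy * vy)))
        return self.x1 + t * vx, self.y1 + t * vy


def enforce_wall(robot: ToioRobot, wall: SketchedWall) -> None:
    """One coupling step: if the robot overlaps the sketched wall, push it out."""
    cx, cy = wall.closest_point(robot.x, robot.y)
    dx, dy = robot.x - cx, robot.y - cy
    dist = math.hypot(dx, dy)
    penetration = robot.radius - dist
    if penetration > 0 and dist > 1e-9:
        # Command the robot to move along the wall normal until it no longer
        # overlaps the virtual sketch.
        robot.drive(dx / dist * penetration, dy / dist * penetration)


if __name__ == "__main__":
    robot = ToioRobot(x=0.5, y=0.2)           # robot overlapping the wall
    wall = SketchedWall(0.0, 0.0, 10.0, 0.0)  # horizontal wall sketched at y = 0
    enforce_wall(robot, wall)
    print(f"robot pushed to ({robot.x:.2f}, {robot.y:.2f})")  # -> (0.50, 1.00)
```

Running the example pushes a robot that initially overlaps the sketched wall back to a non-overlapping position; in the actual system described above, the same correction would be realized as physical motion of the Toio robot, while the opposite direction (the robot pushing a sketched object) would be handled by the physics simulation on the iPad.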

    Published In

    UIST '22: Proceedings of the 35th Annual ACM Symposium on User Interface Software and Technology
    October 2022
    1363 pages
    ISBN: 978-1-4503-9320-1
    DOI: 10.1145/3526113
    Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Publication History

    Published: 28 October 2022


    Author Tags

    1. actuated tangible interfaces
    2. augmented reality
    3. mixed reality
    4. swarm user interfaces

    Qualifiers

    • Research-article
    • Research
    • Refereed limited

    Conference

    UIST '22

    Acceptance Rates

    Overall Acceptance Rate 561 of 2,567 submissions, 22%

    Article Metrics

    • Downloads (Last 12 months)618
    • Downloads (Last 6 weeks)76
    Reflects downloads up to 30 Nov 2024

    Cited By
    • (2024) Co-Designing Programmable Fidgeting Experience with Swarm Robots for Adults with ADHD. Proceedings of the 26th International ACM SIGACCESS Conference on Computers and Accessibility, 1–14. https://doi.org/10.1145/3663548.3675614. Online publication date: 27-Oct-2024.
    • (2024) Augmented Physics: Creating Interactive and Embedded Physics Simulations from Static Textbook Diagrams. Proceedings of the 37th Annual ACM Symposium on User Interface Software and Technology, 1–12. https://doi.org/10.1145/3654777.3676392. Online publication date: 13-Oct-2024.
    • (2024) AniCraft: Crafting Everyday Objects as Physical Proxies for Prototyping 3D Character Animation in Mixed Reality. Proceedings of the 37th Annual ACM Symposium on User Interface Software and Technology, 1–14. https://doi.org/10.1145/3654777.3676325. Online publication date: 13-Oct-2024.
    • (2024) RealityEffects: Augmenting 3D Volumetric Videos with Object-Centric Annotation and Dynamic Visual Effects. Proceedings of the 2024 ACM Designing Interactive Systems Conference, 1248–1261. https://doi.org/10.1145/3643834.3661631. Online publication date: 1-Jul-2024.
    • (2024) Swarm Body: Embodied Swarm Robots. Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems, 1–19. https://doi.org/10.1145/3613904.3642870. Online publication date: 11-May-2024.
    • (2024) Data Cubes in Hand: A Design Space of Tangible Cubes for Visualizing 3D Spatio-Temporal Data in Mixed Reality. Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems, 1–21. https://doi.org/10.1145/3613904.3642740. Online publication date: 11-May-2024.
    • (2024) DungeonMaker: Embedding Tangible Creation and Destruction in Hybrid Board Games through Personal Fabrication Technology. Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems, 1–20. https://doi.org/10.1145/3613904.3642243. Online publication date: 11-May-2024.
    • (2024) pARam: Leveraging Parametric Design in Extended Reality to Support the Personalization of Artifacts for Personal Fabrication. Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems, 1–22. https://doi.org/10.1145/3613904.3642083. Online publication date: 11-May-2024.
    • (2023) BrickStARt: Enabling In-situ Design and Tangible Exploration for Personal Fabrication using Mixed Reality. Proceedings of the ACM on Human-Computer Interaction 7(ISS), 64–92. https://doi.org/10.1145/3626465. Online publication date: 1-Nov-2023.
    • (2023) A Scoping Survey on Cross-reality Systems. ACM Computing Surveys 56(4), 1–38. https://doi.org/10.1145/3616536. Online publication date: 21-Oct-2023.
