
Cone of Vision as a Behavioural Cue for VR Collaboration

Published: 11 November 2022

Abstract

Mutual awareness of visual attention is essential for collaborative work. In the field of collaborative virtual environments (CVEs), Field-of-View (FoV) frustum visualisations have been proposed as a cue to support mutual awareness during collaboration. Recent studies of FoV frustum visualisations focus on asymmetric collaboration with AR/VR hardware setups and 3D-reconstructed environments. In contrast, we focus on general-purpose CVEs (i.e., VR shared offices), whose popularity is increasing due to the availability of low-cost headsets and the restrictions imposed by the pandemic. In these CVEs, collaboration roles are symmetric, and the same 2D content available on desktop computers is displayed on 2D surfaces in 3D space (VR screens). We prototyped one such CVE to evaluate FoV frustum visualisation within this collaboration scenario. We also implemented a FoV visualisation generated from an average fixation map (AFM), and therefore driven directly by users' gaze behaviour, which we call Cone of Vision (CoV). Our approach to displaying frustum visualisations is tailored to 2D surfaces in 3D space and allows for self-awareness of this visual cue. We evaluated CoV in the context of a general exploratory data analysis (EDA) task with 10 pairs of participants. Our findings indicate that CoV is beneficial during shifts between independent and collaborative work and supports collaborative progression across the visualisation. Self-perception of the CoV improves visual attention coupling, reduces the number of times users look at their collaborator's avatar, and offers a consistent representation of the shared reality.
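The abstract describes deriving the Cone of Vision from an average fixation map (AFM) computed over a 2D VR screen. As an illustrative sketch only (the paper does not specify its implementation, and the function names, Gaussian kernel, and 0.5 threshold below are assumptions), an AFM can be built by accumulating a Gaussian blob per fixation point and thresholding the normalised map to obtain the region rendered as the CoV footprint:

```python
import math

def average_fixation_map(fixations, screen_w, screen_h, sigma=30.0):
    """Accumulate fixation points (x, y) on a 2D screen into a smoothed
    fixation map: one Gaussian blob per fixation, normalised to [0, 1]."""
    afm = [[0.0] * screen_w for _ in range(screen_h)]
    for fx, fy in fixations:
        for y in range(screen_h):
            for x in range(screen_w):
                d2 = (x - fx) ** 2 + (y - fy) ** 2
                afm[y][x] += math.exp(-d2 / (2 * sigma ** 2))
    peak = max(max(row) for row in afm)
    return [[v / peak for v in row] for row in afm]

def cone_of_vision_mask(afm, threshold=0.5):
    """Threshold the normalised map: cells covering the densest gaze
    behaviour form the CoV footprint drawn on the shared 2D screen."""
    return [[v >= threshold for v in row] for row in afm]

# Example: three fixations clustered near the screen centre
fixations = [(100, 80), (110, 85), (105, 90)]
afm = average_fixation_map(fixations, screen_w=200, screen_h=160)
mask = cone_of_vision_mask(afm)
```

In this sketch the mask is true near the fixation cluster and false at the screen's periphery; a real-time system would update the map incrementally per gaze sample rather than rebuilding it per frame.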


Cited By

  • (2024) Automatic Control of Virtual Cameras for Capturing and Sharing User Focus and Interaction in Collaborative Virtual Reality. Symmetry 16:2 (228). https://doi.org/10.3390/sym16020228. Online publication date: 13-Feb-2024
  • (2024) Streamspace: A Framework for Window Streaming in Collaborative Mixed Reality Environments. 2024 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW), 859-860. https://doi.org/10.1109/VRW62533.2024.00226. Online publication date: 16-Mar-2024
  • (2024) WindowMirror: An Opensource Toolkit to Bring Interactive Multi-Window Views into XR. 2024 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW), 436-438. https://doi.org/10.1109/VRW62533.2024.00085. Online publication date: 16-Mar-2024
  • (2024) Unveiling joint attention dynamics. Computers & Education 213:C. https://doi.org/10.1016/j.compedu.2024.105002. Online publication date: 25-Jun-2024
  • (2023) How Gaze Visualization Facilitates Initiation of Informal Communication in 3D Virtual Spaces. ACM Transactions on Computer-Human Interaction 31:1 (1-32). https://doi.org/10.1145/3617368. Online publication date: 29-Nov-2023
  • (2023) Speech-Augmented Cone-of-Vision for Exploratory Data Analysis. Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, 1-18. https://doi.org/10.1145/3544548.3581283. Online publication date: 19-Apr-2023


Published In

Proceedings of the ACM on Human-Computer Interaction, Volume 6, Issue CSCW2 (CSCW)
November 2022, 8205 pages
EISSN: 2573-0142
DOI: 10.1145/3571154
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. field of view frustum visualizations
  2. visual attention cues
  3. vr collaborative analytics

Qualifiers

  • Research-article

Funding Sources

  • ICASE award

Article Metrics

  • Downloads (last 12 months): 159
  • Downloads (last 6 weeks): 12

Reflects downloads up to 18 Nov 2024.
