Abstract
Most augmented reality (AR) assembly guidance systems rely on visual information alone. With sound, the human binaural effect lets users quickly identify the approximate direction of a sound source, and pleasant sounds can induce a sense of pleasure and relaxation. However, how workers are affected when stereo sound and visual information are combined for assembly guidance remains unknown. To assess this combination, we built a stereo sound-assisted guidance (SAG) system based on AR, using the tone of a soft instrument, the Chinese lute, as the sound source. To determine whether SAG affects assembly efficiency and user experience, we conducted a usability test comparing SAG with visual information alone. The results show that the SAG system significantly improves the efficiency of assembly guidance, and that processing visual and auditory information simultaneously does not increase user workload or learning difficulty. In addition, in a noisy environment, pleasant sounds help reduce mental strain.
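The abstract does not describe the audio rendering in detail. As a minimal, hypothetical sketch of the underlying idea (not the authors' implementation), the function below maps the azimuth of the next assembly part relative to the user's head to left/right channel gains with a constant-power panning law, so that the binaural effect points the listener toward the target; all names and the panning law are illustrative assumptions.

```python
import math

def stereo_gains(listener_pos, listener_yaw, target_pos):
    """Constant-power stereo panning toward a target position.

    Returns (left_gain, right_gain) so that a guidance tone appears
    to come from the direction of the next assembly part. This is an
    illustrative sketch only; the paper's SAG system is built on an
    AR engine and is not reproduced here.
    """
    dx = target_pos[0] - listener_pos[0]
    dz = target_pos[1] - listener_pos[1]
    # Azimuth of the target relative to the listener's facing direction,
    # wrapped to (-pi, pi]; negative = left, positive = right.
    azimuth = math.atan2(dx, dz) - listener_yaw
    azimuth = math.atan2(math.sin(azimuth), math.cos(azimuth))
    # Map azimuth to a pan value in [-1, 1], then apply constant-power gains.
    pan = max(-1.0, min(1.0, azimuth / (math.pi / 2)))
    angle = (pan + 1.0) * math.pi / 4  # 0 = full left, pi/2 = full right
    return math.cos(angle), math.sin(angle)

if __name__ == "__main__":
    # Target slightly to the listener's right: the right channel is louder.
    left, right = stereo_gains((0.0, 0.0), 0.0, (0.5, 1.0))
    print(f"left={left:.2f}, right={right:.2f}")
```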
Acknowledgements
This work is partly supported by the National Key R&D Program of China (Grant No. 2019YFB1703800).
Ethics declarations
Ethical standards
The authors declare that they have no conflict of interest. This paper includes usability tests, which were conducted with the participants' consent, and all authors were informed of and agreed to the study.
Additional information
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Springer Nature or its licensor holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.
About this article
Cite this article
Feng, S., He, X., He, W. et al. Can you hear it? Stereo sound-assisted guidance in augmented reality assembly. Virtual Reality 27, 591–601 (2023). https://doi.org/10.1007/s10055-022-00680-0