
Can you hear it? Stereo sound-assisted guidance in augmented reality assembly

  • Original Article
  • Published:
Virtual Reality

Abstract

Most augmented reality (AR) assembly guidance systems rely on visual information alone. Sound, however, offers complementary cues: the human binaural effect lets users quickly identify the approximate direction of a sound source, and pleasant sounds can induce a sense of pleasure and relaxation. Yet how workers are affected when stereo sound and visual information are used together for assembly guidance remains unknown. To assess this combination, we built a stereo sound-assisted guidance (SAG) system based on AR, using the tone of a soft instrument, the Chinese lute, as the sound source. To determine whether SAG affects assembly efficiency and user experience, we conducted a usability test comparing SAG with visual information alone. The results show that SAG significantly improves the efficiency of assembly guidance, that processing visual and auditory information simultaneously does not increase user workload or learning difficulty, and that in a noisy environment pleasant sounds help reduce mental strain.
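The directional cue the abstract describes rests on the binaural effect: the left/right level balance of a stereo signal tells the listener roughly where a sound source lies. The article does not publish code, so the sketch below is only a minimal Python illustration of that interaural-level-difference idea, not the authors' implementation; the synthesized plucked tone, the constant-power panning law, and every name in it (lute_like_tone, pan_stereo, write_wav) are hypothetical stand-ins for the Chinese-lute sample and the full spatialization a real AR headset would provide.

```python
import math
import wave
import numpy as np

SAMPLE_RATE = 44100

def lute_like_tone(freq=440.0, duration=1.0):
    """Decaying-harmonic tone: a rough, hypothetical stand-in for the
    Chinese lute sample described in the paper."""
    t = np.linspace(0, duration, int(SAMPLE_RATE * duration), endpoint=False)
    tone = np.zeros_like(t)
    for k, amp in enumerate([1.0, 0.5, 0.25, 0.12], start=1):
        tone += amp * np.sin(2 * math.pi * freq * k * t)
    return tone * np.exp(-3.0 * t)  # pluck-like exponential decay

def pan_stereo(mono, azimuth_deg):
    """Constant-power pan from azimuth -90 (hard left) to +90 (hard right).
    Models only the interaural level difference, not a full HRTF."""
    theta = (azimuth_deg + 90.0) / 180.0 * (math.pi / 2)  # map to 0..pi/2
    left = math.cos(theta) * mono
    right = math.sin(theta) * mono
    return np.stack([left, right], axis=1)  # shape (N, 2), interleaved L/R

def write_wav(path, stereo):
    """Write a float stereo signal in [-1, 1] as 16-bit PCM WAV."""
    pcm = (np.clip(stereo, -1, 1) * 32767).astype(np.int16)
    with wave.open(path, "wb") as f:
        f.setnchannels(2)
        f.setsampwidth(2)
        f.setframerate(SAMPLE_RATE)
        f.writeframes(pcm.tobytes())

if __name__ == "__main__":
    # Cue the listener toward a part located 60 degrees to the right.
    write_wav("cue_right.wav", pan_stereo(lute_like_tone(), azimuth_deg=60))
```

Played over headphones, the generated file sounds as if it comes from the listener's right; that coarse "which way to turn" cue, layered on top of the visual instructions, is the kind of guidance the SAG system evaluates.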


Acknowledgements

This work is partly supported by the National Key R&D Program of China (Grant No. 2019YFB1703800).

Author information


Corresponding authors

Correspondence to Shuo Feng or Weiping He.

Ethics declarations

Ethical standards

The authors declare no conflicts of interest. The usability tests reported in this paper were conducted with the participants' consent, and all authors were informed of and agreed to the study.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Feng, S., He, X., He, W. et al. Can you hear it? Stereo sound-assisted guidance in augmented reality assembly. Virtual Reality 27, 591–601 (2023). https://doi.org/10.1007/s10055-022-00680-0

