Abstract
In AR operation guidance training, two problems remain in the visual representation of AR instructions for assembly parts with similar geometric shapes: (1) AR instructions cannot accurately represent the micro-geometric differences between similar parts, and (2) no AR instruction design specification reflecting the micro-geometric differences of similar parts has been formulated. To address these problems, we carried out the following work. First, the geometric features of parts are defined at the micro-geometric level and the micro-information level, explaining the relationships and differences between similar part features from these two levels. Second, a mathematical model of the geometric features of parts is established, and control parameters in the model are given to characterize the feature differences between similar parts. To verify the validity of the control parameters, we designed three AR instructions based on them and tested five hypotheses. We then analyzed the data from a case study, focusing our discussion on test results that did not meet expectations and examining them through comparison of the observed differences. Finally, three implications of AR instructions for representing feature differences between similar parts are given, and directions for future research on this topic are indicated.
Data Availability
Not applicable.
Code Availability
Not applicable.
Acknowledgments
We thank the anonymous reviewers for their constructive suggestions for improving this paper. We also thank Zhishuo Xiong of the London School of Economics and Political Science for checking the English of an earlier version of the manuscript and helping the authors correct its grammatical errors. We particularly thank the CPILab VR/AR team of Northwestern Polytechnical University for its contribution to this study, and the volunteers from the University of Shanghai for Science and Technology who participated in the experiment.
Funding
This work is partly supported by the Defense Industrial Technology Development Program (No. XXXX2018213A001) and by SASTIND China under Grant JCKY2018205B021.
Author information
Authors and Affiliations
Contributions
Yang Wang provided valuable design solutions for the UX experiment. Jie Zhang and Yueqing Zhang established the basic hardware environment for our research. Yuxiang Yan overcame the key technical difficulties of this research, and Xiangyu Zhang contributed substantially to the collection of experimental data. In particular, we thank Prof. Weiping He and Associate Prof. Xiaoliang Bai for their constructive comments on improving the experiment.
Corresponding author
Ethics declarations
Conflicts of interest/Competing interests
We declare that we have no financial or personal relationships with other people or organizations that could inappropriately influence our work, and no professional or other personal interest of any nature in any product, service, and/or company that could be construed as influencing the position presented in, or the review of, this manuscript.
Additional information
Publisher’s note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Springer Nature or its licensor holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.
About this article
Cite this article
Wang, Z., Wang, Y., Bai, X. et al. Micro-information-level AR instruction: a new visual representation supporting manual classification of similar assembly parts. Multimed Tools Appl 82, 11589–11618 (2023). https://doi.org/10.1007/s11042-022-13574-9
Received:
Revised:
Accepted:
Published:
Issue Date:
DOI: https://doi.org/10.1007/s11042-022-13574-9