
FPSI-Fingertip pose and state-based natural interaction techniques in virtual environments


Abstract

Simple and natural interaction plays a vital role in any realistic virtual environment (VE). This research proposes a set of lightweight gesture-based techniques for interaction in VEs, with a focus on high accuracy, performance, and usability. The proposed techniques use the pose and state of a single fingertip for object/task selection, translation, navigation, rotation, and scaling. Four interaction techniques are proposed: MSGE (menu-based task selection and gesture-based task execution), GSGE (gesture-based task selection and gesture-based task execution), SGTE (single gesture for task selection and execution), and TSGE (time slice-based task selection and gesture-based task execution). With reusability in mind, the index-tip spatial position drives task operation in all four techniques. For experimental evaluation, a VE was designed in Unity3D and interaction was carried out with the Leap Motion controller. The experimental study was conducted with forty (40) volunteer participants and two experts (the authors). The results show improved accuracy for TSGE (participants 97.22%, experts 97.22%) compared with the others (participants: SGTE 95.55%, GSGE 94.44%, MSGE 92.75%; experts: SGTE 94.44%, GSGE 94.44%, MSGE 91.67%). Similarly, TSGE yields higher task performance (participants 112.9 seconds, SD 5.3; experts 101.75 seconds, SD 3.3) than the others (participants: SGTE 117.2 seconds, SD 5.7; GSGE 121.8 seconds, SD 8.0; MSGE 126.7 seconds, SD 12.9; experts: SGTE 107.0 seconds, SD 5.7; GSGE 113.25 seconds, SD 3.5; MSGE 122.0 seconds, SD 3.6). In addition, usability analysis shows high usability for all proposed techniques: TSGE (SUS score 98.5), SGTE (95.75), GSGE (95.25), and MSGE (94.75). Furthermore, a comparative study with state-of-the-art interaction techniques shows that the proposed techniques offer a high accuracy rate, support for multiple tasks and reusability, easy-to-learn fingertip gestures that rely on few features, and four distinct interaction techniques.
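As a rough illustration of the fingertip pose and state idea described in the abstract, the minimal Python sketch below shows how a count of extended fingers could select a task and how the index-tip displacement between two tracking frames could drive a translation. The data structures, names (FingerSample, select_task, apply_translation), and thresholds are hypothetical stand-ins for the tracking data a controller would supply; the paper's actual implementation runs in Unity3D with the Leap Motion controller and is not reproduced here.

from dataclasses import dataclass

@dataclass
class FingerSample:
    tip: tuple        # (x, y, z) index fingertip position in scene units
    extended: bool    # fingertip "state": True if the index finger is extended

def select_task(extended_flags):
    # Hypothetical mapping from the number of extended fingers (thumb..pinky)
    # to a task, in the spirit of gesture-based task selection.
    tasks = {1: "selection", 2: "translation", 3: "rotation", 4: "scaling", 5: "navigation"}
    return tasks.get(sum(extended_flags), "idle")

def apply_translation(obj_pos, prev, curr, gain=1.0):
    # Move the selected object by the index-tip displacement between two frames;
    # manipulation stops as soon as the fingertip state changes (finger retracted).
    if not (prev.extended and curr.extended):
        return obj_pos
    dx, dy, dz = (c - p for c, p in zip(curr.tip, prev.tip))
    return (obj_pos[0] + gain * dx, obj_pos[1] + gain * dy, obj_pos[2] + gain * dz)

# Example: only the index finger is extended, so "selection" is chosen, and two
# consecutive tracking frames translate an object by 0.02 units along x.
print(select_task([False, True, False, False, False]))     # -> selection
prev = FingerSample(tip=(0.00, 0.10, 0.30), extended=True)
curr = FingerSample(tip=(0.02, 0.10, 0.30), extended=True)
print(apply_translation((0.0, 0.0, 0.0), prev, curr))      # -> (0.02, 0.0, 0.0)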





Author information

Corresponding author

Correspondence to Inam Ur Rehman.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Additional information

Ethical approval

All techniques used in this study comply with the code of conduct and the ethical standards of the institutional and national research committee.

Informed consent

Informed consent was obtained from all individual participants included in the study.

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Appendices

Appendix A: The SUS questionnaire

The SUS questionnaire consists of the following questions:

  1. I think I would like to use this system frequently.

  2. I found the system unnecessarily complex.

  3. I thought the system was easy to use.

  4. I think that I would need the support of a technical person to be able to use this system.

  5. I found the various functions in this system were well integrated.

  6. I thought there was too much inconsistency in this system.

  7. I imagine that most people would learn to use this system very quickly.

  8. I found the system very cumbersome to use.

  9. I felt very confident using the system.

  10. I needed to learn a lot of things before I could get going with this system.

Appendix B: Usability (SUS score) of all groups

The usability results of all groups are shown in Tables 4, 5, 6 and 7 using the SUS score; a minimal sketch of the standard SUS scoring scheme follows the table list below.

Table 4 SUS questionnaire results of group G1, which used right-hand index-fingertip based task selection from a menu and right-hand index-fingertip based task operation
Table 5 SUS questionnaire results of group G2, which used right-hand gestures for task selection and right-hand index-fingertip based task operation
Table 6 SUS questionnaire results of group G3, which used individual right-hand fingers for task selection and operation
Table 7 SUS questionnaire results of group G4, which used time slice-based task selection and right-hand fingertip based task execution
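
For reference, the per-group results in Tables 4, 5, 6 and 7 follow the standard SUS scoring scheme (Brooke, 1996). The minimal Python sketch below uses made-up responses and a hypothetical helper name (sus_score); it illustrates the scoring arithmetic only and is not the authors' analysis code.

def sus_score(responses):
    # responses: the ten SUS answers on a 1-5 Likert scale, in item order.
    total = 0
    for item, r in enumerate(responses, start=1):
        # Odd-numbered items are positively worded (contribute r - 1),
        # even-numbered items are negatively worded (contribute 5 - r).
        total += (r - 1) if item % 2 == 1 else (5 - r)
    return total * 2.5   # rescale the 0-40 sum to the 0-100 SUS range

# Example: strongly agreeing with every positive item and strongly disagreeing
# with every negative one yields the maximum score of 100.
print(sus_score([5, 1, 5, 1, 5, 1, 5, 1, 5, 1]))   # -> 100.0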

Rights and permissions

Springer Nature or its licensor holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Rehman, I.U., Ullah, S. & Khan, D. FPSI-Fingertip pose and state-based natural interaction techniques in virtual environments. Multimed Tools Appl 82, 20711–20740 (2023). https://doi.org/10.1007/s11042-022-13824-w


Keywords

Navigation