Robust Hand Motion Tracking through Data Fusion of 5DT Data Glove and Nimble VR Kinect Camera Measurements
Figure 1. Schematic representation of the test setup.
Figure 2. (a–c) Three plots providing the measured orientation of the wooden hand model for varying actual hand orientations (i.e., pitch, yaw, and roll), indicated by red dashed unity lines; (d1–i1) photos of the measured hand orientations; (d2–i4) plots showing the measured metacarpophalangeal (MCP, blue) and proximal interphalangeal (PIP, green) joint angles, as determined with the Nimble VR system (square markers, □) and after fusion with the Data Glove data through application of the Kalman filter (asterisk markers, *). Data are presented with error bars ranging from mean − 1·SD to mean + 1·SD. The actual angles at which the fingers were placed are indicated with the red dotted and dash-dotted lines and are illustrated in the photos on the left. All plots show data collected on the index finger, unless specified otherwise below the photo on the left. Indicated in the top left of every graph are the mean and standard deviation (format: mean ± SD | mean ± SD) calculated over the entire hand orientation range before (left) and after (right) implementation of the Kalman filter, for both joints. Note that these means and standard deviations are calculated as the mean of the means and the mean of the standard deviations, determined per 5 deg step. The triangle markers on the horizontal axes indicate where the difference in mean PIP joint angle between Nimble VR and the Kalman filter is not statistically significant (note: the MCP joint is not visualized in this way).
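For readers unfamiliar with the fusion step referred to in this caption, the sketch below illustrates how a scalar Kalman filter can combine a Nimble VR joint-angle estimate with a (rescaled) Data Glove reading into a single fused angle. It is a minimal illustration only: the constant-angle process model, the noise variances q, r_nimble, and r_glove, and the function name are placeholders, not the parameter values or implementation used in the paper (those are described in Section 2).

```python
import numpy as np

def fuse_joint_angle(nimble_deg, glove_deg, q=4.0, r_nimble=25.0, r_glove=9.0):
    """Scalar Kalman filter fusing Nimble VR and rescaled Data Glove joint angles.

    Minimal sketch: the state is a single joint angle (deg); q, r_nimble, and
    r_glove are illustrative process/measurement variances, not the paper's values.
    """
    x, p = nimble_deg[0], 1.0            # initial state estimate and variance
    fused = []
    for z_nimble, z_glove in zip(nimble_deg, glove_deg):
        # predict: constant-angle model with process noise q
        p = p + q
        # update with the Nimble VR measurement
        k = p / (p + r_nimble)
        x = x + k * (z_nimble - x)
        p = (1.0 - k) * p
        # update with the Data Glove measurement
        k = p / (p + r_glove)
        x = x + k * (z_glove - x)
        p = (1.0 - k) * p
        fused.append(x)
    return np.asarray(fused)
```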
Figure 3. Precision and accuracy for all fingers of the hand model and for all orientation ranges. (a) Pitch range; (b) yaw range; (c) roll range. For each graph, the absolute difference is given between the mean calculated joint angle (i.e., the mean of the mean joint angles) and the true joint angle. The accompanying standard deviation is given as well (i.e., the mean of the standard deviations at all angles), shown as ±2·SD. Note that the values provided here are equal to the values given in the top left of the graphs in Figure 2 minus the true joint angles of the assessed hand pose. A distinction is made between the Nimble VR data (left) and the Kalman-filtered data (right), as well as between the metacarpophalangeal (MCP, blue) and proximal interphalangeal (PIP, green) joints. The fingers are presented from left to right: t = thumb, i = index, m = middle, r = ring, p = pinky.
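As a worked example of the statistics plotted here, the short sketch below computes the accuracy (absolute difference between the mean-of-means and the true angle) and precision (2·mean-of-SDs) from the per-5-deg-step means and standard deviations. The function name and input layout are illustrative.

```python
import numpy as np

def accuracy_precision(step_means, step_sds, true_angle_deg):
    """Accuracy and precision over a hand-orientation sweep.

    step_means, step_sds: per-5-deg-step mean and SD of the measured joint angle.
    Returns |mean-of-means - true angle| and 2 * mean-of-SDs, as plotted in Figure 3.
    """
    accuracy = abs(np.mean(step_means) - true_angle_deg)
    precision = 2.0 * np.mean(step_sds)
    return accuracy, precision
```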
Figure 4. Finger joint estimates during dynamic finger flexions. (a1–a4) Small hand size; (b1–b4) medium hand size; (c1–c4) large hand size. Five sessions with 10 full finger flexion repetitions each were performed. All graphs show the Nimble VR system (red dash-dotted line), Kalman filter output (green dashed line), and Marker Tracking measurements (blue continuous line). The three left graphs show the combined metacarpophalangeal (MCP) and proximal interphalangeal (PIP) joint angles of the third measurement session. The remaining nine plots show the Nimble VR and Kalman filter output plotted vs. the Marker Tracked angles of all fifty finger flexions performed per hand. The black line represents the true angle, and the plots are given for separate and combined joints: (a2–c2) MCP; (a3–c3) PIP; (a4–c4) combined MCP plus PIP joints. In the top right, the Pearson correlation coefficients between the measured angles and the marker-tracked angles are given (red = Nimble VR, green = Kalman filter output).
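The agreement statistic reported in the top right of each panel is the Pearson correlation between a measured joint-angle trace and the Marker Tracking reference. A minimal sketch (function name illustrative):

```python
import numpy as np

def agreement_with_reference(measured_deg, reference_deg):
    """Pearson correlation between a measured joint-angle trace and the
    Marker Tracking reference trace, as reported per panel in Figure 4."""
    return np.corrcoef(measured_deg, reference_deg)[0, 1]
```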
Figure 5. Active index finger flexion comparison between Marker Tracking, Nimble VR, and Kalman-filtered joint angles. Shown data are from the medium-sized hand, first measurement session. (a) Sum of the metacarpophalangeal (MCP) and proximal interphalangeal (PIP) joint angles plotted versus time; (b) MCP joint angles versus time; (c) PIP joint angles versus time.
Figure 6. (a1–a3) Measured hand orientations; (b1–c3) metacarpophalangeal (MCP) index finger joint angle as a function of hand orientation (pitch, roll, and yaw) for two different postures. (b1–b3) Posture 1, flat hand; (c1–c3) posture 2, flexed fingers. Means, standard deviations, and variances were calculated for each of the 5 measurement sessions and subsequently averaged over all sessions. Posture 1 is with a "flat hand", MCP 0 deg and proximal interphalangeal (PIP) 0 deg joint angles; posture 2 is with flexed fingers, MCP 35 deg and PIP 55 deg. The thick (blue) lines are the measured joint angles. At posture 1 (MCP 0 deg) one would expect the measurement data to remain stable around 0 deg as the hand orientation changes, and at posture 2 (MCP 35 deg) stable around 35 deg. The dashed (green) lines are the measured variances. Vertical thick dashed (red) lines distinguish between stable and unstable orientation ranges, and the numbers provided therein are the mean variances for those ranges.
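A minimal sketch of how the per-range variance statistics could be obtained from an orientation sweep is given below. It computes the variance of the measured joint angle within each orientation range and omits the per-session averaging described in the caption; the function name and range boundaries are illustrative.

```python
import numpy as np

def variance_per_range(orientation_deg, angle_deg, boundaries):
    """Variance of the measured joint angle within each orientation range.

    boundaries: e.g. [-90, -30, 30, 90] splits the sweep into three ranges,
    comparable to the ranges separated by the red dashed lines in Figure 6.
    """
    orientation_deg = np.asarray(orientation_deg, float)
    angle_deg = np.asarray(angle_deg, float)
    variances = []
    for lo, hi in zip(boundaries[:-1], boundaries[1:]):
        mask = (orientation_deg >= lo) & (orientation_deg < hi)
        variances.append(np.var(angle_deg[mask]))
    return variances
```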
Figure 7. Top: (a) Index finger motion measured with the 5DT Data Glove during 10 as-fast-as-possible finger flexions of a single participant. Indicated with a bold line are the flexion and extension movements, with their respective marker indicators (o and *) showing the starting and stopping points of the finger motions; (b) calculated accelerations over the course of every finger flexion, superimposed on each other; (c) same as subfigure (b), but for finger extensions. The horizontal (red) dotted lines show the mean peak accelerations determined from the datasets. (Note: this participant was faster than average.)
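Appendix B derives these accelerations from the Data Glove traces. The sketch below shows one plausible way to do so, assuming the glove readings have already been converted to joint angles in degrees and are sampled at a constant rate; the sampling frequency in the usage comment is illustrative, not the value used in the study.

```python
import numpy as np

def peak_acceleration(angle_deg, fs):
    """Peak angular acceleration (deg/s^2) of one flexion or extension segment.

    angle_deg: joint-angle samples of a single movement (deg)
    fs: sampling frequency (Hz); a constant sampling rate is assumed.
    """
    dt = 1.0 / fs
    velocity = np.gradient(angle_deg, dt)       # deg/s
    acceleration = np.gradient(velocity, dt)    # deg/s^2
    return np.max(np.abs(acceleration))

# Mean peak acceleration over repeated flexions, as plotted per participant:
# segments = [np.array([...]), ...]   # one angle trace per detected flexion
# mean_peak = np.mean([peak_acceleration(s, fs=60.0) for s in segments])
```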
Figure 8. Mean peak accelerations (±SD) during finger flexion and extension performed at normal speed (blue) and as fast as possible (green), calculated per participant and subsequently averaged over all 10 participants. (a) Finger flexion measurements; (b) finger extension measurements.
Figure 9. Marker Tracking video analysis. Example of one analyzed frame for the tracking of a single marker. From left to right: (a) original frame; (b) detected background in white, residual image information in black; (c) detected black glove, where the black dots inside the hand show the edges of the markers; (d) residual image information after background and glove subtraction; and (e) red marker pixels detected from (d), overlaid on the original image, with mean shift cluster detection used to determine the center of the marker (shown with a green + in the image).
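The final step of this pipeline locates the marker center by mean shift over the detected red pixels. A minimal numpy sketch of that step is given below; the Gaussian bandwidth and iteration count are illustrative choices, not the values used in the study.

```python
import numpy as np

def mean_shift_center(pixel_xy, bandwidth=10.0, n_iter=30):
    """Estimate a marker center by mean shift over detected red-pixel coordinates.

    pixel_xy: (N, 2) array of [x, y] positions of pixels classified as marker red.
    bandwidth: Gaussian kernel width in pixels (illustrative value).
    """
    center = pixel_xy.mean(axis=0)                  # start at the centroid
    for _ in range(n_iter):
        d2 = np.sum((pixel_xy - center) ** 2, axis=1)
        w = np.exp(-d2 / (2.0 * bandwidth ** 2))    # Gaussian weights
        new_center = (w[:, None] * pixel_xy).sum(axis=0) / w.sum()
        if np.linalg.norm(new_center - center) < 1e-3:
            break
        center = new_center
    return center
```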
Figure 10. Example of detected metacarpophalangeal (MCP, α) and proximal interphalangeal (PIP, β) joint angles as a function of detected marker locations. Angles are provided in degrees.
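Given the tracked marker centers, the joint angles follow from the change in direction between consecutive marker-to-marker segments. The sketch below assumes four markers per finger (hand, MCP, PIP, fingertip); the marker layout and function names are assumptions for illustration, not necessarily the exact placement used in the study.

```python
import numpy as np

def segment_angle_deg(p_from, p_to):
    """Orientation (deg) of the segment from p_from to p_to in the image plane."""
    d = np.asarray(p_to, float) - np.asarray(p_from, float)
    return np.degrees(np.arctan2(d[1], d[0]))

def wrap_deg(a):
    """Wrap an angle difference to the interval [-180, 180)."""
    return (a + 180.0) % 360.0 - 180.0

def joint_angles(m_hand, m_mcp, m_pip, m_tip):
    """MCP (alpha) and PIP (beta) flexion as the change in direction between
    consecutive segments defined by four tracked markers (assumed layout)."""
    a_hand = segment_angle_deg(m_hand, m_mcp)   # metacarpal (hand) segment
    a_prox = segment_angle_deg(m_mcp, m_pip)    # proximal phalanx
    a_dist = segment_angle_deg(m_pip, m_tip)    # phalanx distal to the PIP joint
    alpha = wrap_deg(a_prox - a_hand)           # MCP joint angle
    beta = wrap_deg(a_dist - a_prox)            # PIP joint angle
    return alpha, beta
```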
Figure 11. (a1–b1) Raw metacarpophalangeal (MCP, a1) and proximal interphalangeal (PIP, b1) joint measurement data collected through Marker Tracking, and normalized Data Glove (DG) sensor readings for the index finger; (a2–b2) joint flexions measured through Marker Tracking versus Data Glove sensor readings rescaled to the range [0, 210]. The gradient of the linear least-squares fit provides the weights $w_{DG_{MCP}}$ and $w_{DG_{PIP}}$.
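A minimal sketch of how such a weight could be obtained with an ordinary least-squares fit is shown below. The rescaling from the raw sensor range and the handling of the intercept are assumptions, and the function name is illustrative.

```python
import numpy as np

def data_glove_weight(glove_raw, marker_deg):
    """Weight w_DG for one joint: slope of a linear least-squares fit of the
    Marker Tracking angle onto the Data Glove reading rescaled to [0, 210]."""
    g = np.asarray(glove_raw, float)
    # normalize the raw sensor reading and rescale it to the range [0, 210]
    g_rescaled = 210.0 * (g - g.min()) / (g.max() - g.min())
    slope, intercept = np.polyfit(g_rescaled, np.asarray(marker_deg, float), 1)
    return slope

# e.g. w_dg_mcp = data_glove_weight(glove_mcp_raw, marker_mcp_deg)
```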
Abstract
1. Introduction
1.1. HCI in Laparoscopic Training
1.2. Nimble VR
2. Kalman Filter Procedures and Parameter Settings
Determining the Kalman Filter Parameters
3. Methods
3.1. Test Setup
3.2. Wooden Hand Model Measurements to Validate the Kalman Filter Operation
3.3. Human Hand Measurements to Validate the Kalman Filter Operation
3.4. Dependent Measures
4. Results
4.1. Wooden Hand Model Orientation Measurements
4.2. Wooden Hand Model Finger Joint Measurements—Index and Thumb Fingers
4.3. Wooden Hand Model Finger Joint Measurements—All Fingers
4.4. Human Hand Active Finger Flexion Measurements
5. Discussion
5.1. Setup Limitations
5.2. Active Finger Flexion Measurements
5.3. Data Fusion Improvements
5.4. Application in Medical Field
6. Conclusions
Acknowledgments
Author Contributions
Conflicts of Interest
Appendix A
Orientation Dependent Variance Quantification
Methods
Results
Appendix B
Fingers Maximum Acceleration Determination
Methods
Results
Appendix C
Determination of Data Glove Weights
References
© 2015 by the authors; licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Arkenbout, E.A.; De Winter, J.C.F.; Breedveld, P. Robust Hand Motion Tracking through Data Fusion of 5DT Data Glove and Nimble VR Kinect Camera Measurements. Sensors 2015, 15, 31644-31671. https://doi.org/10.3390/s151229868