DOI: 10.1145/3030024.3040985
Poster

Towards Automatic Skill Evaluation in Microsurgery

Published: 07 March 2017

Abstract

In the past decade, eye tracking has emerged as a promising answer to the growing need to understand surgical expertise. The implicit goal is to design an intelligent user interface (IUI) that monitors and assesses the competency of surgical trainees. In this paper, for the first time in microsurgery, we explore the potential for automatic surgical skill assessment through a combination of machine learning techniques, computational modeling, and eye tracking. We present primary findings from a random forest classification method, with which we achieved a recognition rate of about 70% in distinguishing the expert group from the novice group. This leads us to conclude that prediction of micro-surgeon performance is possible and can be automated, and that eye movement data carry important information about the skills of micro-surgeons.
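
As a rough sketch of the kind of pipeline described above, the following Python snippet trains a random forest classifier on synthetic per-trial eye-movement features and reports cross-validated accuracy. The feature names, data, and hyperparameters are illustrative assumptions only, not the authors' actual setup; in practice, features such as fixation durations and saccade amplitudes would first be extracted from the raw gaze recordings with a fixation/saccade detection step.

import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(seed=0)

# Hypothetical per-trial gaze features: mean fixation duration (ms),
# fixation count, mean saccade amplitude (deg), scanpath length (px).
X = rng.normal(size=(60, 4))
y = np.array([0] * 30 + [1] * 30)  # 0 = novice, 1 = expert (synthetic labels)

# Random forest with 5-fold cross-validation, mirroring the expert/novice
# recognition task; on this random data accuracy stays near chance, unlike
# the roughly 70% reported on real gaze recordings.
clf = RandomForestClassifier(n_estimators=100, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)
print(f"mean cross-validated accuracy: {scores.mean():.2f}")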

References

[1]
N. Ahmidi, G. D. Hager, L. Ishii, G. Fichtinger, G. L. Gallia, and M. Ishii. 2010. Surgical task and skill classi?cation from eye tracking and tool motion in minimally invasive surgery. In International Conference on Medical Image Computing and Computer-Assisted Intervention. Springer, 295?302.
[2]
R. Bertram, L. Helle, J.K. Kaakinen, and E. Svedström. 2013. The effect of expertise on eye movement behaviour in medical image perception. PloS one 8, 6 (2013), e66169.
[3]
S. Eivazi. 2016. Eye gaze patterns in micro-neurosurgery: from remote to ocular-based eye tracker. Ph.D. Dissertation. University of Eastern Finland, Dissertations in Forestry and Natural Sciences.
[4]
S. Eivazi, R. Bednarik, M. Tukiainen, M. von und zu Fraunberg, V. Leinonen, and J. E. Jääskeläinen. 2012. Gaze behaviour of expert and novice microneurosurgeons differs during observations of tumor removal recordings. In Proceedings of the Symposium on Eye Tracking Research and Applications. ACM, 377?380.
[5]
F. Hermens, R. Flin, and I. Ahmed. 2013. Eye movements in surgery: A literature review. Journal of Eye Movement Research 6, 4 (2013).
[6]
T. Kübler, S. Eivazi, and E. Kasneci. 2015. Automated Visual Scanpath Analysis Reveals the Expertise Level of Micro-neurosurgeons. In MICCAI Workshop on Interventional Microscopy.
[7]
T. C. Kübler, C. Rothe, U. Schiefer, W. Rosenstiel, and E. Kasneci. 2016. SubsMatch 2.0: Scanpath comparison and classi?cation based on subsequence frequencies. Behavior Research Methods (2016), 1?17.
[8]
H. L. Kundel and P. S. La Follette Jr. 1972. Visual search patterns and experience with radiological images 1. Radiology 103, 3 (1972), 523?528.
[9]
S. Ramachandran, A. M. Ghanem, and S. R. Myers. 2013. Assessment of microsurgery competency-where are we now? Microsurgery 33, 5 (2013), 406?415.
[10]
L. Richstone, M. J. Schwartz, C. Seideman, J. Cadeddu, S. Marshall, and L. R. Kavoussi. 2010. Eye metrics as an objective assessment of surgical skill. Annals of surgery 252, 1 (2010), 177?182.
[11]
Dario D Salvucci and Joseph H Goldberg. 2000. Identifying ?xations and saccades in eye-tracking protocols. In Proceedings of the 2000 symposium on Eye tracking research & applications. ACM, 71?78.
[12]
T. Tien, P. H. Pucher, M. H. Sodergren, K. Sriskandarajah, G. Yang, and A. Darzi. 2014. Eye tracking for skills assessment and training: a systematic review. journal of surgical research 191, 1 (2014), 169?178.

    Published In

    IUI '17 Companion: Companion Proceedings of the 22nd International Conference on Intelligent User Interfaces
    March 2017
    246 pages
    ISBN:9781450348935
    DOI:10.1145/3030024
    Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Publication History

    Published: 07 March 2017

    Author Tags

    1. eye tracking
    2. machine learning
    3. micro-neurosurgery
    4. skill assessment

    Qualifiers

    • Poster

    Conference

    IUI'17

    Acceptance Rates

    IUI '17 Companion Paper Acceptance Rate: 63 of 272 submissions (23%)
    Overall Acceptance Rate: 746 of 2,811 submissions (27%)

    Bibliometrics & Citations

    Bibliometrics

    Article Metrics

    • Downloads (Last 12 months): 5
    • Downloads (Last 6 weeks): 0
    Reflects downloads up to 17 Dec 2024

    Citations

    Cited By

    • (2023) What Is Hidden in Clear Sight and How to Find It—A Survey of the Integration of Artificial Intelligence and Eye Tracking. Information 14(11), 624. DOI: 10.3390/info14110624. Online publication date: 20-Nov-2023.
    • (2023) Using electroencephalography to explore neurocognitive correlates of procedural proficiency: A pilot study to compare experts and novices during simulated endotracheal intubation. Brain and Cognition 165, 105938. DOI: 10.1016/j.bandc.2022.105938. Online publication date: Feb-2023.
    • (2023) What we see is what we do: a practical Peripheral Vision-Based HMM framework for gaze-enhanced recognition of actions in a medical procedural task. User Modeling and User-Adapted Interaction 33(4), 939-965. DOI: 10.1007/s11257-022-09352-9. Online publication date: 4-Jan-2023.
    • (2021) Expertise Classification of Soccer Goalkeepers in Highly Dynamic Decision Tasks: A Deep Learning Approach for Temporal and Spatial Feature Recognition of Fixation Image Patch Sequences. Frontiers in Sports and Active Living 3. DOI: 10.3389/fspor.2021.692526. Online publication date: 26-Jul-2021.
    • (2021) Soccer goalkeeper expertise identification based on eye movements. PLOS ONE 16(5), e0251070. DOI: 10.1371/journal.pone.0251070. Online publication date: 19-May-2021.
    • (2021) TEyeD: Over 20 Million Real-World Eye Images with Pupil, Eyelid, and Iris 2D and 3D Segmentations, 2D and 3D Landmarks, 3D Eyeball, Gaze Vector, and Eye Movement Types. 2021 IEEE International Symposium on Mixed and Augmented Reality (ISMAR), 367-375. DOI: 10.1109/ISMAR52148.2021.00053. Online publication date: Oct-2021.
    • (2021) Fully Convolutional Neural Networks for Raw Eye Tracking Data Segmentation, Generation, and Reconstruction. 2020 25th International Conference on Pattern Recognition (ICPR), 142-149. DOI: 10.1109/ICPR48806.2021.9413268. Online publication date: 10-Jan-2021.
    • (2021) Explainable Online Validation of Machine Learning Models for Practical Applications. 2020 25th International Conference on Pattern Recognition (ICPR), 3304-3311. DOI: 10.1109/ICPR48806.2021.9412959. Online publication date: 10-Jan-2021.
    • (2021) 1000 Pupil Segmentations in a Second using Haar Like Features and Statistical Learning. 2021 IEEE/CVF International Conference on Computer Vision Workshops (ICCVW), 3459-3469. DOI: 10.1109/ICCVW54120.2021.00386. Online publication date: Oct-2021.
    • (2020) A Study of Expert/Novice Perception in Arthroscopic Shoulder Surgery. Proceedings of the 4th International Conference on Medical and Health Informatics, 71-77. DOI: 10.1145/3418094.3418135. Online publication date: 14-Aug-2020.
