
Translational and Rotational Arrow Cues (TRAC) Navigation Method for Manual Alignment Tasks

Published: 10 February 2020

Abstract

Many tasks in image-guided surgery require a clinician to manually position an instrument in space, with respect to a patient, with five or six degrees of freedom (DOF). Displaying the current and desired pose of the object on a 2D display such as a computer monitor is straightforward. However, providing guidance to accurately and rapidly navigate the object in 5-DOF or 6-DOF is challenging. Guidance is typically accomplished by showing distinct orthogonal viewpoints of the workspace, requiring simultaneous alignment in all views. Although such methods are commonly used, they can be quite unintuitive, and it can take a long time to perform an accurate 5-DOF or 6-DOF alignment task. In this article, we describe a method of visually communicating navigation instructions using translational and rotational arrow cues (TRAC) defined in an object-centric frame, while displaying a single principal view that approximates the human’s egocentric view of the physical object. The target pose of the object is provided but typically is used only for the initial gross alignment. During the accurate-alignment stage, the user follows the unambiguous arrow commands. In a series of human-subject studies, we show that the TRAC method outperforms two common orthogonal-view methods—the triplanar display, and a sight-alignment method that closely approximates the Acrobot Navigation System—in terms of time to complete 5-DOF and 6-DOF navigation tasks. We also find that subjects can achieve 1 mm and 1° accuracy using the TRAC method with a median completion time of less than 20 seconds.
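The method itself reduces to a small amount of geometry: the pose error between the current and target object poses, expressed in the object-centric (body) frame so that each arrow maps directly onto an axis of the physical object the user is looking at. The Python sketch below is a minimal illustration of that computation under stated assumptions, not the authors' implementation: it assumes tracked poses arrive as NumPy position vectors and SciPy Rotation objects, it reuses the 1 mm and 1 degree tolerances reported above, and the names trac_cues and dominant_arrow (including the show-the-worst-component cue policy) are hypothetical.

```python
import numpy as np
from scipy.spatial.transform import Rotation


def trac_cues(p_cur, R_cur, p_tgt, R_tgt):
    """Pose error in the object-centric (body) frame.

    p_cur, p_tgt: (3,) current and target positions in the world frame.
    R_cur, R_tgt: scipy Rotation objects giving the current and target
        body-to-world orientations.
    Returns (t_err, r_err): translation error in meters and rotation
    error as an axis-angle rotation vector in radians, both expressed
    along the object's own axes.
    """
    # Rotate the world-frame position error into the body frame so each
    # component corresponds to a translational arrow along one object axis.
    t_err = R_cur.inv().apply(p_tgt - p_cur)
    # Relative rotation from current to target orientation, expressed in
    # the body frame; each component is a rotational arrow about one axis.
    r_err = (R_cur.inv() * R_tgt).as_rotvec()
    return t_err, r_err


def dominant_arrow(t_err, r_err, t_tol=1e-3, r_tol=np.deg2rad(1.0)):
    """Hypothetical cue-selection policy (not taken from the article):
    cue the component farthest outside its tolerance, normalized so that
    millimeters and degrees are comparable; return None once the pose is
    within 1 mm and 1 degree."""
    cues = [("translate", "xyz"[i], t_err[i] / t_tol) for i in range(3)]
    cues += [("rotate", "xyz"[i], r_err[i] / r_tol) for i in range(3)]
    worst = max(cues, key=lambda c: abs(c[2]))
    return worst if abs(worst[2]) > 1.0 else None


# Example: object 5 mm off along the world x axis and 3 degrees about z.
p_cur, R_cur = np.zeros(3), Rotation.identity()
p_tgt = np.array([0.005, 0.0, 0.0])
R_tgt = Rotation.from_euler("z", 3.0, degrees=True)
print(dominant_arrow(*trac_cues(p_cur, R_cur, p_tgt, R_tgt)))
# -> ('translate', 'x', 5.0): five times the 1 mm tolerance along the
#    object's x axis, so a translational arrow along +x would be shown.
```

Expressing both errors in the body frame is what permits the single principal, egocentric view the abstract describes: each arrow points along an axis of the object itself rather than an axis of the camera or the room.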



Published In

ACM Transactions on Applied Perception, Volume 17, Issue 1
January 2020
79 pages
ISSN:1544-3558
EISSN:1544-3965
DOI:10.1145/3382777
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 10 February 2020
Accepted: 01 September 2019
Revised: 01 July 2019
Received: 01 August 2018
Published in TAP Volume 17, Issue 1


Author Tags

  1. Image-guided surgery
  2. pose matching
  3. visual guidance

Qualifiers

  • Research-article
  • Research
  • Refereed

Article Metrics

  • Downloads (last 12 months): 139
  • Downloads (last 6 weeks): 22
Reflects downloads up to 25 Nov 2024

Cited By
  • (2024) Screw-Tip Soft Magnetically Steerable Needles. IEEE Transactions on Medical Robotics and Bionics 6, 1 (Feb 2024), 4-17. DOI: 10.1109/TMRB.2023.3265721
  • (2023) Review of Enhanced Handheld Surgical Drills. Critical Reviews in Biomedical Engineering 51, 6 (2023), 29-50. DOI: 10.1615/CritRevBiomedEng.2023049106
  • (2023) Data generalization processing and fusion machine translation system based on virtual reality technology. In Second International Conference on Electronic Information Technology (EIT 2023) (15 Aug 2023). DOI: 10.1117/12.2685452
  • (2022) Precueing Sequential Rotation Tasks in Augmented Reality. In Proceedings of the 28th ACM Symposium on Virtual Reality Software and Technology (29 Nov 2022), 1-11. DOI: 10.1145/3562939.3565641
  • (2022) Towards reducing visual workload in surgical navigation: proof-of-concept of an augmented reality haptic guidance system. Computer Methods in Biomechanics and Biomedical Engineering: Imaging & Visualization 11, 4 (5 Dec 2022), 1073-1080. DOI: 10.1080/21681163.2022.2152372
