Abstract
Echocardiography (ECHO) is commonly used to assist in the diagnosis of cardiovascular diseases (CVDs). However, manually acquiring standardized ECHO views by manipulating the probe demands significant experience and training from sonographers. In this work, we propose a visual navigation system for cardiac ultrasound view planning, designed to help novice sonographers accurately obtain the views required for CVD diagnosis. The system introduces a view-agnostic feature extractor that exploits the spatial relationships among source frames, learning the relative rotations between frames as a regression target and thereby enabling transfer learning that improves the accuracy and robustness of localizing specific target planes. In addition, we present a target consistency loss that encourages frames from the same scan to regress to the same target plane. Experimental results show that the average error for the apical four-chamber (A4C) view can be reduced to 7.055\(^\circ \). Moreover, practical clinical validation indicates that, guided by the visual navigation system, the average time to acquire the A4C view is reduced by a factor of at least 3.86, which is instructive for the clinical practice of novice sonographers.
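To make the two training signals named in the abstract concrete, the following is a minimal PyTorch sketch of (a) a geodesic loss on relative rotations and (b) a target consistency penalty that pulls per-frame predictions from one scan toward a common target plane. The function names, tensor shapes, and the mean-anchored form of the consistency term are illustrative assumptions, not the paper's actual formulation.

```python
import torch

def geodesic_loss(R_pred: torch.Tensor, R_gt: torch.Tensor) -> torch.Tensor:
    """Mean geodesic distance between predicted and ground-truth rotations.

    R_pred, R_gt: (B, 3, 3) rotation matrices. Assumed parameterization;
    the paper may regress rotations in a different representation.
    """
    R_rel = torch.bmm(R_pred.transpose(1, 2), R_gt)          # relative rotation
    trace = R_rel.diagonal(dim1=1, dim2=2).sum(-1)            # tr(R_rel)
    cos_angle = torch.clamp((trace - 1.0) / 2.0, -1.0 + 1e-7, 1.0 - 1e-7)
    return torch.acos(cos_angle).mean()                       # angle in radians

def target_consistency_loss(target_preds: torch.Tensor) -> torch.Tensor:
    """Penalize disagreement among predictions of the same target plane.

    target_preds: (N, D) pose predictions for N frames from one scan.
    Here the anchor is the batch mean; this is an assumption for illustration.
    """
    mean_pred = target_preds.mean(dim=0, keepdim=True)
    return ((target_preds - mean_pred) ** 2).sum(dim=1).mean()
```

In this sketch the two terms would simply be summed (optionally with a weighting factor) to form the training objective; the weighting and the exact anchor used in the paper are not specified in the abstract.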
Acknowledgments
This work is supported in part by the Major Project of Science and Technology Innovation 2030 - New Generation Artificial Intelligence under Grant 2021ZD0140407, in part by the National Natural Science Foundation of China under Grant U21A20523, and in part by the Beijing Natural Science Foundation under Grants L222152 and 7244325.
Ethics declarations
Disclosure of Interests
The authors have no competing interests to declare that are relevant to the content of this article.
Copyright information
© 2024 The Author(s), under exclusive license to Springer Nature Switzerland AG
Cite this paper
Bao, M. et al. (2024). Real-World Visual Navigation for Cardiac Ultrasound View Planning. In: Linguraru, M.G., et al. Medical Image Computing and Computer Assisted Intervention – MICCAI 2024. MICCAI 2024. Lecture Notes in Computer Science, vol 15001. Springer, Cham. https://doi.org/10.1007/978-3-031-72378-0_30
DOI: https://doi.org/10.1007/978-3-031-72378-0_30
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-72377-3
Online ISBN: 978-3-031-72378-0