Abstract
Accurate segmentation of organs-at-risk (OARs) from Computed Tomography (CT) images is a key step in efficient radiation therapy planning for nasopharyngeal carcinoma (NPC) treatment. Convolutional Neural Networks (CNNs) have recently become the state-of-the-art method for automated OAR segmentation. However, due to the low contrast of head and neck organ tissues in CT, fully automatic segmentations may still need to be refined to become accurate and robust enough for clinical use. We propose a deep learning-based multi-organ interactive segmentation method to improve the results obtained by an automatic CNN and to reduce the user interactions needed during refinement for higher accuracy. We use one CNN to obtain an initial automatic segmentation, on which user interactions are added to indicate mis-segmentations. A second CNN takes the user interactions and the initial segmentation as input and produces a refined result. We propose a dimension-separated lightweight network that gives faster and better dense predictions. In addition, we propose a mis-segmentation-based weighting strategy combined with loss functions to achieve more accurate segmentation. We validated the proposed framework on 3D segmentation of head and neck organs from CT images. Experimental results show that our method achieves a large improvement over automatic CNNs, and obtains higher accuracy with fewer user interactions and less time than traditional interactive segmentation methods.
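As a minimal sketch of the two ideas summarised above, the snippet below (PyTorch; the module and function names, layer choices, and weighting factor are illustrative assumptions, not the authors' released code) shows (a) a dimension-separated 3D convolution block that factorises a 3D convolution into an intra-slice 2D convolution followed by an inter-slice 1D convolution, and (b) a mis-segmentation-weighted soft Dice loss in which voxels that the initial segmentation labelled incorrectly are up-weighted when training the refinement network.

```python
# Hedged sketch, not the paper's implementation: a dimension-separated
# convolution block and a mis-segmentation-weighted Dice loss.
import torch
import torch.nn as nn


class DimSeparatedConv3d(nn.Module):
    """3D conv factorised into intra-slice (1 x k x k) and inter-slice (k x 1 x 1) parts."""

    def __init__(self, in_ch: int, out_ch: int, k: int = 3):
        super().__init__()
        p = k // 2
        self.intra = nn.Conv3d(in_ch, out_ch, kernel_size=(1, k, k), padding=(0, p, p))
        self.inter = nn.Conv3d(out_ch, out_ch, kernel_size=(k, 1, 1), padding=(p, 0, 0))
        self.norm = nn.InstanceNorm3d(out_ch)  # instance normalisation, as in Ulyanov et al.
        self.act = nn.ReLU(inplace=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.act(self.norm(self.inter(self.intra(x))))


def missegmentation_weighted_dice(prob, target, init_seg, extra_weight=2.0, eps=1e-5):
    """Soft Dice loss in which voxels mis-labelled by the initial segmentation
    (init_seg != target) are up-weighted by `extra_weight` (an assumed value)."""
    weight = 1.0 + extra_weight * (init_seg != target).float()
    prob, target, weight = prob.flatten(), target.float().flatten(), weight.flatten()
    inter = (weight * prob * target).sum()
    denom = (weight * (prob + target)).sum()
    return 1.0 - (2.0 * inter + eps) / (denom + eps)
```

Factorising each 3x3x3 convolution into 1x3x3 and 3x1x1 parts reduces parameters and computation, which is one common way to obtain the kind of lightweight 3D network the abstract refers to; the refinement CNN described above would additionally take the initial segmentation and user-interaction maps as extra input channels.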