Abstract
Despite recent advances in deep learning-based medical image computing, clinical deployment in patient-care settings remains limited, with the lack of sufficiently diverse training data being a pivotal impediment to robust real-life model performance. Continual learning (CL) endows deep neural networks (DNNs) with a desirable property: the ability to continually learn from new data and accumulate knowledge while retaining what was previously learned. In this work we present a simple and effective CL approach for sequential multi-domain learning (MDL) and showcase its utility on the skin lesion image classification task. Specifically, we propose a new pruning criterion that allows a fixed-capacity network to learn new data domains sequentially over time. Our MDL approach incrementally builds on knowledge gained from previously learned domains, without requiring access to their training data, while avoiding catastrophic forgetting and maintaining accurate performance on all learned domains. Our new pruning criterion detects culprit units associated with wrong classifications in each domain and releases these units so they can be dedicated to subsequent learning on new domains. To reduce the computational cost of retraining the network after pruning, we implement MergePrune, which efficiently merges the pruning and training stages into one step. Furthermore, at inference time, instead of relying on a test-time oracle, we design a smart gate using Siamese networks to assign a test image to the most appropriate domain and its corresponding learned model. We present extensive experiments on 6 skin lesion image databases, representing different domains with varying levels of data bias and class imbalance, including quantitative comparisons against multiple baselines and state-of-the-art methods, which demonstrate the superior performance and computational efficiency of our proposed method.
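To make the culprit-pruning idea concrete, below is a minimal PyTorch sketch of one plausible realization: channels are scored by how strongly they activate on misclassified images of the current domain, and the highest-scoring ("culprit") channels are released for the next domain. The `features`/`classifier` split of the model, the activation-based score, and the release fraction are illustrative assumptions, not the authors' exact criterion.

```python
import torch


def culprit_scores(model, loader, device="cpu"):
    """Accumulate a per-channel culprit score over one domain's data.

    Minimal sketch (assumption, not the paper's exact formulation): a
    channel's score grows with its mean activation on misclassified
    images, so channels most implicated in wrong predictions rank
    highest. Assumes the model exposes `features` and `classifier` heads.
    """
    model.eval()
    scores = None
    with torch.no_grad():
        for images, labels in loader:
            images, labels = images.to(device), labels.to(device)
            feats = model.features(images)                     # (B, C, H, W)
            logits = model.classifier(feats.mean(dim=(2, 3)))  # global-avg-pool head
            wrong = (logits.argmax(dim=1) != labels).float()   # 1 for misclassified
            act = feats.abs().mean(dim=(2, 3))                 # (B, C) channel activations
            batch_score = (act * wrong.unsqueeze(1)).sum(dim=0)
            scores = batch_score if scores is None else scores + batch_score
    return scores  # higher = more implicated in wrong predictions


def release_culprits(scores, frac=0.2):
    """Free the top-`frac` culprit channels (mask value 0) so they can be
    retrained on the next domain; the remaining channels stay frozen (1)."""
    k = max(1, int(frac * scores.numel()))
    mask = torch.ones_like(scores)
    mask[torch.topk(scores, k).indices] = 0.0
    return mask
```

In such a scheme the returned mask would be applied to the layer's weights or activations during training on the next domain, so that frozen units preserve earlier domains while released units absorb the new one.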
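The inference-time gate can likewise be sketched with a shared-weight (Siamese-style) encoder that routes a test image to the domain whose stored prototype embedding is most similar. The prototype construction and cosine-similarity routing below are illustrative assumptions rather than the authors' exact architecture.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class SiameseGate(nn.Module):
    """Minimal sketch of a domain-assignment gate: a shared encoder embeds
    the test image, and the image is assigned to the domain whose stored
    prototype embedding has the highest cosine similarity."""

    def __init__(self, encoder, domain_prototypes):
        super().__init__()
        self.encoder = encoder                          # shared embedding network
        # one embedding per learned domain, shape (num_domains, emb_dim)
        self.register_buffer("prototypes", domain_prototypes)

    def forward(self, x):
        emb = F.normalize(self.encoder(x), dim=1)       # (B, emb_dim)
        protos = F.normalize(self.prototypes, dim=1)    # (num_domains, emb_dim)
        sims = emb @ protos.t()                         # cosine similarities
        return sims.argmax(dim=1)                       # predicted domain per image
```

The predicted domain index would then select the corresponding learned model (or pruning mask) for classification, removing the need for a test-time oracle.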
Acknowledgement
This work was funded by the NSERC Discovery program and Compute Canada.
Copyright information
© 2021 Springer Nature Switzerland AG
About this paper
Cite this paper
Bayasi, N., Hamarneh, G., Garbi, R. (2021). Culprit-Prune-Net: Efficient Continual Sequential Multi-domain Learning with Application to Skin Lesion Classification. In: de Bruijne, M., et al. (eds.) Medical Image Computing and Computer Assisted Intervention – MICCAI 2021. Lecture Notes in Computer Science, vol. 12907. Springer, Cham. https://doi.org/10.1007/978-3-030-87234-2_16
DOI: https://doi.org/10.1007/978-3-030-87234-2_16
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-87233-5
Online ISBN: 978-3-030-87234-2
eBook Packages: Computer Science; Computer Science (R0)