%0 Unpublished work
%T Move-to-Data: A new Continual Learning approach with Deep CNNs, Application for image-class recognition *
%+ Laboratoire Bordelais de Recherche en Informatique (LaBRI)
%+ Institut de Neurosciences cognitives et intégratives d'Aquitaine (INCIA)
%A Poursanidis, Miltiadis
%A Benois-Pineau, Jenny
%A Zemmari, Akka
%A Mansencal, Boris
%A de Rugy, Aymar
%8 2020-06-12
%D 2020
%Z 2006.07152
%Z Computer Science [cs]/Artificial Intelligence [cs.AI]
%Z Computer Science [cs]/Computer Vision and Pattern Recognition [cs.CV]
%Z Computer Science [cs]/Machine Learning [cs.LG]
%Z Computer Science [cs]/Neural and Evolutionary Computing [cs.NE]
%Z Preprints, Working Papers, ...
%X In many real-life applications of supervised learning, not all of the training data are available at the same time. Examples include lifelong image classification, the recognition of environmental objects during the interaction of instrumented persons with their environment, or the enrichment of an online database with new images. It is therefore necessary to pre-train the model during a "training recording phase" and then adjust it to the newly arriving data. This is the task of incremental/continual learning approaches. Among the different problems addressed by these approaches, such as the introduction of new categories into the model, the refinement of existing categories into sub-categories, and the extension of trained classifiers over them, we focus on adjusting a pre-trained model with additional training data for existing categories. We propose a fast continual learning layer at the end of the neural network. The results are illustrated on the open-source CIFAR benchmark dataset. The proposed scheme achieves performance similar to retraining, but at a drastically lower computational cost.
%G English
%2 https://hal.science/hal-02865878v1/document
%2 https://hal.science/hal-02865878v1/file/main.pdf
%L hal-02865878
%U https://hal.science/hal-02865878
%~ CNRS
%~ AIV
%~ TEST-HALCNRS