Jun 11, 2020 · Here we investigate self-training as another method to utilize additional data on the same setup and contrast it against ImageNet pre-training.
An increasingly popular pre-training method is self-supervised learning. Self-supervised learning methods pre-train on a dataset without using its labels, with the ...
Self-training works well on exactly the setup where pre-training does not (using ImageNet to help COCO), and on the PASCAL segmentation dataset, ...
This paper investigates self-training methods for utilizing additional data and compares self-training against supervised/self-supervised pre-training methods.
Oct 8, 2021 · This paper goes into more depth than those papers, covering data augmentation, different pre-training methods, and different pre-trained ...
Self-training works across dataset sizes and is additive to pre-training. Next we analyze the performance of self-training as we vary the COCO labeled dataset ...
Jun 28, 2020 · This new paper not only talks about pre-training but also investigates self-training and how it compares to pre-training and self-supervised learning for the ...
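The "self-training" these snippets contrast with pre-training is the pseudo-labeling loop: train on labeled data, predict labels for unlabeled data, keep the confident predictions as new labels, and retrain. The sketch below is a toy illustration of that loop using a nearest-centroid classifier on 1-D points with an invented confidence margin; it is an assumption for clarity, not the paper's actual detection/segmentation pipeline.

```python
def fit_centroids(points, labels):
    """Trivial 'model': the per-class mean of the training points."""
    centroids = {}
    for c in set(labels):
        members = [p for p, l in zip(points, labels) if l == c]
        centroids[c] = sum(members) / len(members)
    return centroids

def predict(centroids, x):
    """Return (predicted class, confidence) where confidence is the
    gap between the two closest class centroids."""
    dists = sorted((abs(x - m), c) for c, m in centroids.items())
    (d0, c0), (d1, _) = dists[0], dists[1]
    return c0, d1 - d0

def self_train(labeled, unlabeled, margin=1.0, rounds=3):
    """Pseudo-labeling loop: repeatedly absorb confident unlabeled
    points into the training set, then refit."""
    points, labels = list(labeled[0]), list(labeled[1])
    pool = list(unlabeled)
    for _ in range(rounds):
        model = fit_centroids(points, labels)
        keep = []
        for x in pool:
            c, conf = predict(model, x)
            if conf >= margin:          # only trust confident pseudo-labels
                points.append(x)
                labels.append(c)
            else:                       # ambiguous points stay unlabeled
                keep.append(x)
        pool = keep
    return fit_centroids(points, labels)

# Two well-separated classes seeded by one labeled example each;
# the unlabeled pool fills in the rest (5.2 stays ambiguous).
model = self_train(([0.0, 10.0], [0, 1]), [0.5, 1.0, 9.0, 9.5, 5.2])
print(predict(model, 8.0)[0])  # prints 1
```

The margin threshold plays the role of the confidence cutoff used when generating pseudo-labels at scale; too low a threshold lets noisy labels in, too high a threshold leaves the extra data unused.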