Semi-supervised Continual Learning with Meta Self-training
Publisher: Association for Computing Machinery, New York, NY, United States
Qualifiers: Short-paper
Article Metrics
- Total Citations: 0
- Total Downloads: 210
- Downloads (last 12 months): 52
- Downloads (last 6 weeks): 4