Abstract: This paper introduces a new framework for data-efficient and versatile learning. Specifically:
1) We develop ML-PIP, a general framework for Meta-Learning approximate Probabilistic Inference for Prediction. ML-PIP extends existing probabilistic interpretations of meta-learning to cover a broad class of methods (a sketch of its training objective follows this list).
2) We introduce \Versa{}, an instance of the framework employing a flexible and versatile amortization network that takes few-shot learning datasets as inputs, with arbitrary numbers of shots, and outputs a distribution over task-specific parameters in a single forward pass (see the code sketch after this list). \Versa{} substitutes optimization at test time with forward passes through inference networks, amortizing the cost of inference and relieving the need for second derivatives during training.
3) We evaluate \Versa{} on benchmark datasets, where it sets new state-of-the-art results and handles arbitrary numbers of shots and, for classification, arbitrary numbers of classes at train and test time. The power of the approach is then demonstrated on a challenging few-shot ShapeNet view-reconstruction task.
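For concreteness, the ML-PIP objective from (1) can be written as below. This is a sketch in illustrative notation ($\mathcal{D}$ denotes a task's support set, $(x^*, y^*)$ a query pair, $\psi$ the task-specific parameters, and $\theta$ the shared amortization parameters); the paper's exact formulation may differ in detail.

```latex
% ML-PIP trains the amortized approximate posterior q_theta by maximizing
% the expected predictive log-likelihood over tasks (notation illustrative):
\begin{align}
  q_\theta(y^* \mid x^*, \mathcal{D})
    &= \int p(y^* \mid x^*, \psi)\, q_\theta(\psi \mid \mathcal{D})\, \mathrm{d}\psi \\
  \mathcal{L}(\theta)
    &= \mathbb{E}_{p(\mathcal{D},\, x^*,\, y^*)}\!\big[\log q_\theta(y^* \mid x^*, \mathcal{D})\big]
\end{align}
```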
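To make the amortization idea in (2) concrete, here is a minimal PyTorch sketch of an inference network that maps a support set with an arbitrary number of shots per class to a Gaussian over per-class classifier weights in a single forward pass. All names, sizes, and the assumed pre-computed feature embeddings are illustrative, not the authors' implementation; see the linked repository for that.

```python
import torch
import torch.nn as nn

class AmortizedInferenceNet(nn.Module):
    """Sketch of a VERSA-style amortization network: maps a few-shot
    support set to a Gaussian over task-specific (per-class) classifier
    weights in a single forward pass. Illustrative, not the authors' code."""

    def __init__(self, feat_dim: int = 64):
        super().__init__()
        # Heads mapping a pooled per-class representation to the mean and
        # log-variance of that class's weight vector.
        self.mean_head = nn.Linear(feat_dim, feat_dim)
        self.logvar_head = nn.Linear(feat_dim, feat_dim)

    def forward(self, support_feats, support_labels, num_classes):
        # support_feats: (n_support, feat_dim) embeddings of support examples
        # support_labels: (n_support,) integer labels in [0, num_classes)
        means, logvars = [], []
        for c in range(num_classes):
            # Average-pooling over however many shots class c has lets the
            # network accept arbitrary numbers of shots (and classes).
            pooled = support_feats[support_labels == c].mean(dim=0)
            means.append(self.mean_head(pooled))
            logvars.append(self.logvar_head(pooled))
        mean = torch.stack(means)                      # (num_classes, feat_dim)
        std = torch.exp(0.5 * torch.stack(logvars))
        # Reparameterized sample of the task-specific classifier weights.
        return mean + std * torch.randn_like(std)

# Usage: logits for query embeddings via the sampled linear classifier.
# net = AmortizedInferenceNet(feat_dim=64)
# weights = net(support_feats, support_labels, num_classes=5)
# logits = query_feats @ weights.t()                  # (n_query, num_classes)
```

Sampling weights several times and averaging the resulting softmax probabilities then approximates the predictive distribution written above, with no test-time optimization.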
Keywords: probabilistic models, approximate inference, few-shot learning, meta-learning
TL;DR: Novel framework for meta-learning that unifies and extends a broad class of existing few-shot learning methods. Achieves strong performance on few-shot learning benchmarks without requiring iterative test-time inference.
Code: [Gordonjo/versa](https://github.com/Gordonjo/versa)
Data: [ShapeNet](https://paperswithcode.com/dataset/shapenet), [mini-Imagenet](https://paperswithcode.com/dataset/mini-imagenet)
Community Implementations: [2 code implementations](https://www.catalyzex.com/paper/arxiv:1805.09921/code)