🤖 AI Summary
This work addresses few-shot meta-learning with a compositional framework grounded in probabilistic task inference. The method models tasks as structured compositions of reusable computational modules (e.g., functions or subroutines) and learns a generative probabilistic model that captures priors over the modules and their compositional regularities across a task distribution. Solving a new task is then formulated as Bayesian structural inference, which requires no gradient-based adaptation: modules are identified and recombined from minimal data (e.g., a single example) through highly constrained hypothesis testing. By combining the representational power of neural networks with the data efficiency of probabilistic inference, the approach recovers ground-truth modular structure and statistics in rule learning and motor learning tasks, and it achieves substantial improvements in cross-task generalization and adaptation speed over gradient-based meta-learners.
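The core idea of "learning as inference over compositions" can be illustrated with a deliberately small sketch. The paper's modules and priors are learned by a generative model; here the modules (`add1`, `double`, `negate`), the uniform prior, and the Gaussian likelihood are all hypothetical stand-ins, chosen only to show how one example can pin down a composition by hypothesis testing, with no parameter updates.

```python
import itertools
import math

# Hypothetical reusable modules (in the paper these are learned, not hand-coded).
MODULES = {"add1": lambda x: x + 1, "double": lambda x: 2 * x, "negate": lambda x: -x}

# Hypothesis space: all two-module compositions, with an assumed uniform prior
# (the paper instead learns priors over modules and their co-occurrence statistics).
compositions = list(itertools.product(MODULES, repeat=2))
prior = {c: 1.0 / len(compositions) for c in compositions}

def likelihood(comp, x, y, noise=0.1):
    """Gaussian likelihood of observing output y after applying composition comp to x."""
    f, g = MODULES[comp[0]], MODULES[comp[1]]
    pred = g(f(x))
    return math.exp(-((pred - y) ** 2) / (2 * noise ** 2))

def infer(examples):
    """Posterior over compositions given (x, y) pairs -- pure inference, no gradients."""
    post = {}
    for c in compositions:
        p = prior[c]
        for x, y in examples:
            p *= likelihood(c, x, y)
        post[c] = p
    z = sum(post.values()) or 1.0
    return {c: p / z for c, p in post.items()}

# One-shot task inference: which composition maps 3 to 8?
posterior = infer([(3, 8)])
best = max(posterior, key=posterior.get)  # ("add1", "double"), since (3 + 1) * 2 == 8
```

A single example suffices here because the hypothesis space is small and highly constrained; the paper's contribution is making this tractable at scale by learning the modules and their statistical priors from a family of tasks.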
📝 Abstract
To solve a new task from minimal experience, it is essential to effectively reuse knowledge from previous tasks, a problem known as meta-learning. Compositional solutions, where common elements of computation are flexibly recombined into new configurations, are particularly well-suited for meta-learning. Here, we propose a compositional meta-learning model that explicitly represents tasks as structured combinations of reusable computations. We achieve this by learning a generative model that captures the underlying components and their statistics shared across a family of tasks. This approach transforms learning a new task into a probabilistic inference problem, which allows for finding solutions without parameter updates through highly constrained hypothesis testing. Our model successfully recovers ground truth components and statistics in rule learning and motor learning tasks. We then demonstrate its ability to quickly infer new solutions from just single examples. Together, our framework joins the expressivity of neural networks with the data efficiency of probabilistic inference to achieve rapid compositional meta-learning.