Neural Variational Dropout Processes

📅 2025-10-22
📈 Citations: 0
Influential: 0
🤖 AI Summary
To address the challenge of rapid adaptation to novel tasks in few-shot multi-task learning, this paper proposes a Bayesian meta-learning framework based on neural variational dropout. The method introduces: (1) a task-conditioned, low-rank product-of-Bernoulli-experts meta-model that predicts per-task dropout rates through a lightweight, memory-efficient parameterization; (2) a task-dependent prior conditioned on the full task data, jointly modeling parameter uncertainty and functional ambiguity; and (3) amortized variational inference that jointly optimizes the shared backbone and the task-specific dropout configurations, enabling plug-and-play reconfiguration of the network for new tasks without fine-tuning. Evaluated on 1D stochastic regression, image inpainting, and few-shot classification, the approach significantly improves generalization and inference robustness while reducing parameter redundancy.
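Below is a minimal PyTorch sketch of how such a low-rank product-of-Bernoulli-experts mapping could look. This is not the authors' code: the class name `LowRankDropoutPredictor`, the hidden sizes, and the rank-1 row-plus-column logit combination are illustrative assumptions. The point is that the per-weight rate matrix for an (out × in) layer is predicted from O(out + in) quantities instead of O(out × in).

```python
import torch
import torch.nn as nn

class LowRankDropoutPredictor(nn.Module):
    """Illustrative (hypothetical) meta-model: maps a pooled task/context
    embedding to per-weight dropout rates for an (out_dim x in_dim) layer
    via a rank-1 combination of row and column logits."""

    def __init__(self, context_dim, out_dim, in_dim, hidden=64):
        super().__init__()
        self.row_head = nn.Sequential(
            nn.Linear(context_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, out_dim))
        self.col_head = nn.Sequential(
            nn.Linear(context_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, in_dim))

    def forward(self, context):
        # context: (context_dim,) embedding pooled from the observed points
        row_logits = self.row_head(context)        # (out_dim,)
        col_logits = self.col_head(context)        # (in_dim,)
        # Broadcast row + column logits into a full (out_dim, in_dim) grid
        logits = row_logits.unsqueeze(1) + col_logits.unsqueeze(0)
        return torch.sigmoid(logits)               # dropout rates in (0, 1)

# Example use: reconfigure one shared layer for a new task
predictor = LowRankDropoutPredictor(context_dim=32, out_dim=128, in_dim=64)
rates = predictor(torch.randn(32))                 # (128, 64) dropout rates
mask = torch.bernoulli(1.0 - rates)                # sampled per-weight keep-mask
```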

📝 Abstract
Learning to infer the conditional posterior model is a key step for robust meta-learning. This paper presents a new Bayesian meta-learning approach called Neural Variational Dropout Processes (NVDPs). NVDPs model the conditional posterior distribution based on a task-specific dropout; a low-rank product of Bernoulli experts meta-model is utilized for a memory-efficient mapping of dropout rates from a few observed contexts. It allows for a quick reconfiguration of a globally learned and shared neural network for new tasks in multi-task few-shot learning. In addition, NVDPs utilize a novel prior conditioned on the whole task data to optimize the conditional dropout posterior in the amortized variational inference. Surprisingly, this enables the robust approximation of task-specific dropout rates that can deal with a wide range of functional ambiguities and uncertainties. We compared the proposed method with other meta-learning approaches on few-shot learning tasks such as 1D stochastic regression, image inpainting, and classification. The results show the excellent performance of NVDPs.
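The "prior conditioned on the whole task data" suggests the variational objective penalizes the divergence between the context-conditioned posterior over dropout masks and a prior computed from the full task (context plus targets). A hedged sketch of that regularizer, assuming both distributions factorize as independent Bernoullis over mask entries (the function name and the numerical clamp are my own):

```python
import torch

def bernoulli_kl(q_rates, p_rates, eps=1e-6):
    """Elementwise KL( Bern(q) || Bern(p) ) between two factorized
    Bernoulli mask distributions, summed over all mask entries."""
    q = q_rates.clamp(eps, 1.0 - eps)
    p = p_rates.clamp(eps, 1.0 - eps)
    kl = q * (q / p).log() + (1 - q) * ((1 - q) / (1 - p)).log()
    return kl.sum()
```

In an amortized variational objective, this term would be added to the target-set prediction loss, pulling the context-predicted rates toward rates inferred from the whole task.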
Problem

Research questions and friction points this paper is trying to address.

Learning robust conditional posterior models for Bayesian meta-learning
Adapting a globally shared neural network to new few-shot tasks efficiently
Handling functional ambiguity and uncertainty via task-specific dropout rates
Innovation

Methods, ideas, or system contributions that make the work stand out.

Bayesian meta-learning with Neural Variational Dropout Processes
Low-rank Bernoulli experts meta-model for memory efficiency
Task-specific dropout rates optimized through amortized variational inference (see the training-step sketch below)
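As a rough illustration of how these pieces could combine in one training step, the sketch below samples the mask with a relaxed (Concrete) Bernoulli so it stays differentiable. The backbone interface, the stop-gradient on the prior, and all hyperparameters are assumptions, not the paper's specification; `bernoulli_kl` is the sketch from the Abstract section above.

```python
import torch
import torch.nn.functional as F
from torch.distributions import RelaxedBernoulli

def nvdp_style_step(backbone, predictor, context_emb, full_emb,
                    x_target, y_target, temperature=0.5, beta=1.0):
    """Hypothetical training step: predict task dropout rates from the
    context, sample a differentiable keep-mask, run the shared backbone
    under that mask, and regularize toward the full-task prior."""
    q_rates = predictor(context_emb)        # posterior rates (context only)
    p_rates = predictor(full_emb).detach()  # prior rates (whole task data);
                                            # stop-gradient is one design choice
    # Concrete relaxation keeps mask sampling differentiable w.r.t. q_rates
    keep = RelaxedBernoulli(torch.tensor(temperature),
                            probs=1.0 - q_rates).rsample()
    pred = backbone(x_target, weight_mask=keep)  # assumed backbone interface
    nll = F.mse_loss(pred, y_target)             # regression-style fit term
    return nll + beta * bernoulli_kl(q_rates, p_rates)
```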
👥 Authors
Insu Jeon, Seoul National University
Youngjin Park, Everdoubling LLC., Seoul, South Korea
Gunhee Kim, Professor, Seoul National University (Computer Vision, Machine Learning, Natural Language Processing)