🤖 AI Summary
This paper addresses the challenge of modeling stochastic functions from sparse context-target pairs. Existing approaches suffer from poor scalability (e.g., Gaussian processes), overly restrictive distributional assumptions (e.g., Gaussianity), or limited capacity to represent multimodal distributions (e.g., neural processes, standard neural diffusion). To overcome these limitations, we propose the Neural Bridge Process (NBP), a novel diffusion-based framework. NBP treats input contexts as dynamic anchors along diffusion trajectories, explicitly conditioning the forward transition kernel on context and enforcing constrained bridge paths that strictly terminate at the supervised targets, thereby strengthening gradient signals and ensuring semantic consistency at the diffusion endpoint. Built upon the DDPM framework, NBP integrates bridge sampling with neural process principles to achieve tightly input-coupled modeling of functional distributions. Extensive experiments on synthetic data, EEG regression, and image regression demonstrate significant improvements over state-of-the-art baselines, validating both theoretical rigor and empirical superiority.
📄 Abstract
Learning stochastic functions from partially observed context-target pairs is a fundamental problem in probabilistic modeling. Traditional models such as Gaussian Processes (GPs) face scalability issues on large datasets and assume Gaussianity, limiting their applicability. While Neural Processes (NPs) offer more flexibility, they struggle to capture complex, multi-modal target distributions. Neural Diffusion Processes (NDPs) enhance expressivity through a learned diffusion process but rely solely on conditional signals in the denoising network, resulting in weak input coupling from an unconditional forward process and a semantic mismatch at the diffusion endpoint. In this work, we propose Neural Bridge Processes (NBPs), a novel method for modeling stochastic functions in which the inputs x act as dynamic anchors for the entire diffusion trajectory. By reformulating the forward kernel to depend explicitly on x, NBP enforces a constrained path that strictly terminates at the supervised target. This approach not only provides stronger gradient signals but also guarantees endpoint coherence. We validate NBPs on synthetic data, EEG signal regression, and image regression tasks, achieving substantial improvements over baselines. These results underscore the effectiveness of DDPM-style bridge sampling in enhancing both performance and theoretical consistency for structured prediction tasks.
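To make the core idea concrete, the following is a minimal sketch of a Brownian-bridge-style forward kernel of the kind the abstract describes: the trajectory is pinned to the data at time 0 and to a supervised anchor at time T, so the endpoint variance vanishes by construction. The function name, the linear mean schedule, and the `sigma` parameter are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def bridge_forward_sample(y0, y_end, t, T, sigma=1.0, rng=None):
    """Sample x_t from a bridge forward kernel pinned at both ends.

    With s = t / T, the marginal is
        x_t ~ N((1 - s) * y0 + s * y_end,  s * (1 - s) * sigma^2),
    so x_0 = y0 and x_T = y_end exactly: the diffusion strictly
    terminates at the supervised anchor (endpoint consistency).
    """
    rng = np.random.default_rng() if rng is None else rng
    s = t / T
    mean = (1.0 - s) * np.asarray(y0) + s * np.asarray(y_end)
    std = sigma * np.sqrt(s * (1.0 - s))  # zero at s = 0 and s = 1
    return mean + std * rng.standard_normal(np.shape(y0))
```

In the NBP setting described above, `y_end` would be determined by the supervised target (coupled to the input x), in contrast to a standard DDPM forward process, whose endpoint is input-independent Gaussian noise.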