HGAN-SDEs: Learning Neural Stochastic Differential Equations with Hermite-Guided Adversarial Training

📅 2025-12-23
📈 Citations: 0
Influential: 0
🤖 AI Summary
To address the high computational cost and training instability of path discriminators in Neural Stochastic Differential Equation (SDE)-based generative modeling, this paper proposes a lightweight continuous-time discriminator built upon the Hermite function basis, establishing the first Hermite-guided adversarial training framework tailored to SDE-induced path distributions. Theoretically, we provide the first universal approximation and convergence guarantees for this discriminator with respect to SDE path measures. Methodologically, we explicitly capture temporal dependencies in sample paths via Hermite series expansions, drastically reducing both parameter count and computational overhead. Experiments on synthetic and real-world datasets demonstrate substantial improvements in sample quality, enhanced training stability, and over 40% acceleration in inference speed—outperforming state-of-the-art SDE-based generative methods in overall performance.

📝 Abstract
Neural Stochastic Differential Equations (Neural SDEs) provide a principled framework for modeling continuous-time stochastic processes and have been widely adopted in fields ranging from physics to finance. Recent advances suggest that Generative Adversarial Networks (GANs) offer a promising solution to learning the complex path distributions induced by SDEs. However, a critical bottleneck lies in designing a discriminator that faithfully captures temporal dependencies while remaining computationally efficient. Prior works have explored Neural Controlled Differential Equations (CDEs) as discriminators due to their ability to model continuous-time dynamics, but such architectures suffer from high computational costs and exacerbate the instability of adversarial training. To address these limitations, we introduce HGAN-SDEs, a novel GAN-based framework that leverages Neural Hermite functions to construct a structured and efficient discriminator. Hermite functions provide an expressive yet lightweight basis for approximating path-level dynamics, enabling both reduced runtime complexity and improved training stability. We establish the universal approximation property of our framework for a broad class of SDE-driven distributions and theoretically characterize its convergence behavior. Extensive empirical evaluations on synthetic and real-world systems demonstrate that HGAN-SDEs achieve superior sample quality and learning efficiency compared to existing generative models for SDEs.
Problem

Research questions and friction points this paper is trying to address.

Design efficient discriminator for Neural SDEs
Reduce computational cost and training instability
Improve sample quality and learning efficiency
Innovation

Methods, ideas, or system contributions that make the work stand out.

Hermite functions create efficient discriminator for GANs
Reduces runtime complexity and improves training stability
Universal approximation for SDE-driven distributions proven
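To make the basis-projection idea concrete: the paper's discriminator represents a path through its coefficients in an orthonormal Hermite-function basis rather than through a costly Neural CDE. The sketch below illustrates only that generic idea, assuming a discretized scalar path and a fixed linear readout; all function names are hypothetical and this is not the authors' implementation.

```python
import numpy as np

def hermite_functions(t, n_max):
    """First n_max orthonormal Hermite functions psi_0..psi_{n_max-1} at points t,
    computed with the stable three-term recurrence:
      psi_0(t)     = pi**-0.25 * exp(-t**2 / 2)
      psi_1(t)     = sqrt(2) * t * psi_0(t)
      psi_{n+1}(t) = sqrt(2/(n+1)) * t * psi_n(t) - sqrt(n/(n+1)) * psi_{n-1}(t)
    """
    t = np.asarray(t, dtype=float)
    psi = np.empty((n_max, t.size))
    psi[0] = np.pi ** -0.25 * np.exp(-(t ** 2) / 2.0)
    if n_max > 1:
        psi[1] = np.sqrt(2.0) * t * psi[0]
    for n in range(1, n_max - 1):
        psi[n + 1] = (np.sqrt(2.0 / (n + 1)) * t * psi[n]
                      - np.sqrt(n / (n + 1.0)) * psi[n - 1])
    return psi

def trapezoid(y, x):
    """Composite trapezoidal rule along the last axis."""
    dx = np.diff(x)
    return np.sum(dx * (y[..., 1:] + y[..., :-1]) / 2.0, axis=-1)

def path_features(ts, xs, n_max):
    """Project a discretized path (ts, xs) onto the Hermite basis by quadrature."""
    psi = hermite_functions(ts, n_max)   # shape (n_max, len(ts))
    return trapezoid(psi * xs, ts)       # coefficient vector, shape (n_max,)

# Toy usage: featurize one sample path and score it with a fixed linear readout
# (in a real discriminator the readout weights would be trained adversarially).
rng = np.random.default_rng(0)
ts = np.linspace(-5.0, 5.0, 1001)
xs = np.sin(ts)                          # stand-in for an SDE sample path
coeffs = path_features(ts, xs, n_max=8)
score = float(rng.standard_normal(8) @ coeffs)
```

Because the basis is fixed, the per-path cost is a single matrix-vector projection, which is the source of the runtime savings the summary attributes to the Hermite discriminator relative to CDE-based ones.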
Yuanjian Xu
Thrust of Financial Technology, The Hong Kong University of Science and Technology (Guangzhou)
Yuan Shuai
School of Software and Microelectronics, Peking University
Jianing Hao
The Hong Kong University of Science and Technology (Guangzhou)
Guang Zhang
Thrust of Financial Technology, The Hong Kong University of Science and Technology (Guangzhou)