🤖 AI Summary
Standard flow-based generative models struggle to capture fine-scale features in multiscale scientific data, such as broadband random fields or PDE solutions, often suffering from numerical ill-conditioning and large reconstruction errors. To address this, we propose Scale-Adaptive Flow (SA-Flow), a generative flow framework that aligns the noise's spectral decay rate with the target data's Fourier spectrum within a stochastic interpolant setting, keeping the initial drift field well behaved, and that introduces a spectrum-aware adaptive time-scheduling strategy to enable progressive, cross-scale feature modeling. SA-Flow combines Fourier spectral analysis, non-Gaussian noise modeling, and a flow-based architecture. Evaluated on Gaussian random fields and on solutions to the Allen–Cahn and Navier–Stokes equations, it achieves significantly improved sample fidelity while reducing computational cost by 30–50% compared to conventional flow-based methods.
📝 Abstract
Flow-based generative models can face significant challenges when modeling scientific data with multiscale Fourier spectra, often producing large errors in fine-scale features. We address this problem within the framework of stochastic interpolants, via principled design of noise distributions and interpolation schedules. The key insight is that the noise should not be smoother than the target data distribution (as measured by Fourier spectrum decay rates) to ensure bounded drift fields near the initial time. For Gaussian and near-Gaussian distributions whose fine-scale structure is known, we show that spectrum-matched noise improves numerical efficiency compared to standard white-noise approaches. For complex non-Gaussian distributions, we develop scale-adaptive interpolation schedules that address the numerical ill-conditioning arising from rougher-than-data noise. Numerical experiments on synthetic Gaussian random fields and solutions to the stochastic Allen–Cahn and Navier–Stokes equations validate our approach and demonstrate its ability to generate high-fidelity samples at lower computational cost than traditional approaches.
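To make the "spectrum-matched noise" idea concrete, the sketch below samples a periodic 1-D Gaussian field whose Fourier power spectrum decays like |k|^(-α), so the decay rate can be tuned to match (rather than exceed) that of the target data. This is an illustrative construction only, not the paper's implementation; the helper name `spectrum_matched_noise` and the normalization choices are assumptions.

```python
import numpy as np

def spectrum_matched_noise(n, alpha, seed=None):
    """Sample a periodic 1-D Gaussian field with power spectrum ~ |k|^(-alpha).

    alpha = 0 recovers white noise; larger alpha gives smoother fields.
    (Hypothetical helper illustrating spectrum-matched noise, not the
    paper's actual implementation.)
    """
    rng = np.random.default_rng(seed)
    k = np.fft.rfftfreq(n, d=1.0 / n)        # integer wavenumbers 0..n/2
    amp = np.zeros_like(k)
    amp[1:] = k[1:] ** (-alpha / 2.0)        # power ~ k^-alpha => amplitude ~ k^(-alpha/2)
    # Independent Gaussian real/imag parts per Fourier mode, scaled by amp:
    coeffs = amp * (rng.standard_normal(k.size) + 1j * rng.standard_normal(k.size))
    coeffs[0] = 0.0                          # zero out the DC mode (zero-mean field)
    field = np.fft.irfft(coeffs, n=n)
    return field / field.std()               # normalize to unit variance
```

Matching α to the decay rate of the target distribution's spectrum is what keeps the interpolant's drift bounded near the initial time; white noise (α = 0) is rougher than most PDE solution fields, which is the ill-conditioned regime the scale-adaptive schedules are designed to handle.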