SpectraKAN: Conditioning Spectral Operators

📅 2026-02-05
🤖 AI Summary
This work addresses the limitations of existing spectral neural operators—such as the Fourier Neural Operator (FNO)—which employ static Fourier kernels and struggle to model multiscale, anisotropic, and state-dependent dynamics in partial differential equations (PDEs). We propose SpectraKAN, the first method to enable input-conditioned modulation of spectral operators: by extracting a compact global representation from spatiotemporal history and dynamically modulating a multiscale Fourier backbone via single-query cross-attention, it transforms static spectral convolutions into input-adaptive integral operators. We theoretically establish that SpectraKAN converges to a resolution-invariant continuous operator under mesh refinement and exhibits globally smooth, Lipschitz-controlled modulation. Empirically, it achieves state-of-the-art performance across multiple PDE benchmarks, reducing RMSE by up to 49% over strong baselines and demonstrating significant gains in complex spatiotemporal prediction tasks.

📝 Abstract
Spectral neural operators, particularly Fourier Neural Operators (FNO), are a powerful framework for learning solution operators of partial differential equations (PDEs) due to their efficient global mixing in the frequency domain. However, existing spectral operators rely on static Fourier kernels applied uniformly across inputs, limiting their ability to capture multi-scale, regime-dependent, and anisotropic dynamics governed by the global state of the system. We introduce SpectraKAN, a neural operator that conditions the spectral operator on the input itself, turning static spectral convolution into an input-conditioned integral operator. This is achieved by extracting a compact global representation from the spatio-temporal history and using it to modulate a multi-scale Fourier trunk via single-query cross-attention, enabling the operator to adapt its behaviour while retaining the efficiency of spectral mixing. We provide theoretical justification showing that this modulation converges to a resolution-independent continuous operator under mesh refinement, and that the KAN-based modulation is globally smooth and Lipschitz-controlled. Across diverse PDE benchmarks, SpectraKAN achieves state-of-the-art performance, reducing RMSE by up to 49% over strong baselines, with particularly large gains on challenging spatio-temporal prediction tasks.
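The core idea in the abstract can be sketched in a few lines of NumPy. This is a minimal illustration, not the authors' implementation: the pooled statistics, the projection matrices (`W_ctx`, `W_k`, `W_v`), the `1 + gains` parameterisation, and the number of scales are all illustrative assumptions. It shows the mechanism described above: a compact global summary of the input acts as a single query attending over per-scale keys, and the resulting weights modulate a static multi-scale Fourier kernel before the spectral convolution is applied.

```python
import numpy as np

rng = np.random.default_rng(0)
n, k_max, d_ctx, n_scales = 64, 8, 16, 3  # grid size, retained modes, context dim, scales

# Static base spectral kernel: one complex weight per retained Fourier mode, per scale.
base_kernel = rng.standard_normal((n_scales, k_max)) + 1j * rng.standard_normal((n_scales, k_max))

# Hypothetical learnable projections (random here) for the single-query cross-attention.
W_ctx = rng.standard_normal((d_ctx, 4))       # lifts pooled input statistics to a context vector
W_k = rng.standard_normal((n_scales, d_ctx))  # one key per scale
W_v = rng.standard_normal((n_scales, k_max))  # one per-mode gain pattern (value) per scale

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def spectra_layer(u):
    """One input-conditioned spectral convolution layer (sketch)."""
    # 1. Compact global representation: simple pooled statistics stand in for the
    #    spatio-temporal history encoder described in the abstract.
    stats = np.array([u.mean(), u.std(), np.abs(u).max(), (u ** 2).mean()])
    ctx = np.tanh(W_ctx @ stats)                        # context vector, shape (d_ctx,)

    # 2. Single-query cross-attention: the context vector is the one query,
    #    rows of W_k are the per-scale keys, rows of W_v the values.
    scores = softmax(W_k @ ctx / np.sqrt(d_ctx))        # (n_scales,) mixing weights
    gains = 1.0 + scores @ W_v                          # per-mode modulation, (k_max,)

    # 3. Modulated spectral convolution: FFT, scale the retained low modes by the
    #    input-conditioned mixture of base kernels, inverse FFT.
    u_hat = np.fft.rfft(u)
    y_hat = np.zeros_like(u_hat)
    for s in range(n_scales):
        y_hat[:k_max] += scores[s] * gains * base_kernel[s] * u_hat[:k_max]
    return np.fft.irfft(y_hat, n=n)

x = np.linspace(0, 2 * np.pi, n, endpoint=False)
y = spectra_layer(np.sin(3 * x))
```

With static kernels (plain FNO), the bracketed factor applied to `u_hat` would be the same for every input; here `scores` and `gains` change with the input field, which is what turns the static spectral convolution into an input-conditioned integral operator.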
Problem

Research questions and friction points this paper is trying to address.

spectral neural operators
Fourier Neural Operators
multi-scale dynamics
anisotropic dynamics
input conditioning
Innovation

Methods, ideas, or system contributions that make the work stand out.

Spectral Neural Operators
Input-Conditioned Modulation
Fourier Neural Operators
Cross-Attention
Resolution-Independent Operator
Chun-Wun Cheng
PhD student, University of Cambridge
Implicit Deep Learning · Applied Mathematics · Generative AI
C. Schönlieb
DAMTP, University of Cambridge, Cambridge, UK
Angelica I. Avilés-Rivero
YMSC, Tsinghua University, Beijing, China