COAST: Intelligent Time-Adaptive Neural Operators

📅 2025-02-12
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
Neural operators for dynamical systems often suffer from a trade-off between accuracy and efficiency due to fixed time stepping. To address this, we propose the first intelligent adaptive neural solver integrating a causal language model (CLM). Our method jointly models trajectory prediction and optimal step-size selection, leveraging the CLM’s capacity for intra-trajectory dynamic awareness and cross-system generalization—enabling step sizes to adapt to local dynamical complexity. The entire architecture is end-to-end differentiable, supporting joint optimization of prediction and stepping. Evaluated across diverse dynamical system benchmarks, our approach consistently outperforms state-of-the-art methods: achieving comparable or superior accuracy while reducing computational cost by 35%–62% on average. These results validate the effectiveness, robustness, and scalability of the “prediction-step co-optimization” paradigm.

Technology Category

Application Category

📝 Abstract
We introduce Causal Operator with Adaptive Solver Transformer (COAST), a novel neural operator learning method that leverages a causal language model (CLM) framework to dynamically adapt time steps. Our method predicts both the evolution of a system and its optimal time step, intelligently balancing computational efficiency and accuracy. We find that COAST generates variable step sizes that correlate with the underlying system intrinsicities, both within and across dynamical systems. Within a single trajectory, smaller steps are taken in regions of high complexity, while larger steps are employed in simpler regions. Across different systems, more complex dynamics receive more granular time steps. Benchmarked on diverse systems with varied dynamics, COAST consistently outperforms state-of-the-art methods, achieving superior performance in both efficiency and accuracy. This work underscores the potential of CLM-based intelligent adaptive solvers for scalable operator learning of dynamical systems.
Problem

Research questions and friction points this paper is trying to address.

Dynamically adapts time steps for system evolution.
Balances computational efficiency with accuracy.
Outperforms methods in efficiency and accuracy.
Innovation

Methods, ideas, or system contributions that make the work stand out.

Adaptive time step prediction
Causal language model framework
Dynamic computational efficiency balance
🔎 Similar Papers
No similar papers found.
Z
Zhikai Wu
Peking University
S
Shiyang Zhang
Yale University
S
Sizhuang He
Yale University
Sifan Wang
Sifan Wang
Postdoctoral fellow, Yale University
Scientific Machine LearningAI for ScienceMachine LearningDeep Learning
M
Min Zhu
Yale University
A
Anran Jiao
Yale University
L
Lu Lu
Yale University
David van Dijk
David van Dijk
Assistant Professor, Yale University
machine learningcomputational biology