STNet: Spectral Transformation Network for Solving Operator Eigenvalue Problem

📅 2025-10-27
📈 Citations: 0
Influential: 0
🤖 AI Summary
Traditional numerical methods for operator eigenvalue problems suffer from the curse of dimensionality, while existing deep learning approaches exhibit poor generalization and an accuracy that depends heavily on the operator's spectral distribution. To address these challenges, this paper proposes a neural-network-based iterative optimization framework. Its core innovation is a learnable spectral transformation mechanism that jointly performs deflation projection and filtering transformations, dynamically reformulating the original problem into an equivalent but more tractable one. This design prevents re-convergence to already-known eigenfunctions and enhances resolution within target spectral regions. By integrating neural network approximation with differentiable spectral operations, the method enables efficient and stable iterative computation of high-dimensional operator eigenstructures. Experiments demonstrate that the approach significantly outperforms state-of-the-art learning-based methods in accuracy while exhibiting strong generalization and robustness across diverse spectral configurations.

📝 Abstract
Operator eigenvalue problems play a critical role in various scientific fields and engineering applications, yet traditional numerical methods are hindered by the curse of dimensionality. Recent deep learning methods provide an efficient way to address this challenge by iteratively updating neural networks. The performance of these methods relies heavily on the spectral distribution of the given operator: larger gaps between the operator's eigenvalues improve precision, so spectral transformations tailored to the spectral distribution can enhance their performance. Based on this observation, we propose the Spectral Transformation Network (STNet). During each iteration, STNet uses the current approximate eigenvalues and eigenfunctions to apply spectral transformations to the original operator, turning it into an equivalent but easier problem. Specifically, we employ deflation projection to exclude the subspace spanned by already-solved eigenfunctions, thereby reducing the search space and avoiding convergence to existing eigenfunctions. In addition, our filter transform magnifies eigenvalues in the desired region and suppresses those outside it, further improving performance. Extensive experiments demonstrate that STNet consistently outperforms existing learning-based methods, achieving state-of-the-art accuracy.
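The two transformations the abstract names, deflation projection and the filter transform, have classical matrix counterparts. The following is a minimal numpy sketch of those counterparts on a toy symmetric matrix; the matrix, its size, and the shift value are hypothetical, and STNet itself realizes these operations as learnable, differentiable components rather than explicit matrix algebra.

```python
import numpy as np

# Toy self-adjoint operator (stand-in for a discretized differential operator).
rng = np.random.default_rng(0)
M = rng.standard_normal((50, 50))
A = (M + M.T) / 2
eigvals, eigvecs = np.linalg.eigh(A)   # reference spectrum for comparison

# Deflation projection: project out the subspace of already-solved
# eigenfunctions so the iteration cannot re-converge to them.
V = eigvecs[:, :2]                     # pretend these two pairs are solved
P = np.eye(50) - V @ V.T               # projector onto the orthogonal complement
A_defl = P @ A @ P                     # remaining eigenpairs unchanged,
                                       # deflated eigenvalues mapped to 0

# Filter (shift-invert) transform: eigenvalues of (A - sigma*I)^{-1} are
# 1/(lambda_i - sigma), so eigenvalues near sigma are strongly magnified
# and those far away are suppressed, widening relative spectral gaps.
sigma = eigvals[5] + 1e-3              # small offset keeps A - sigma*I invertible
F = np.linalg.inv(A - sigma * np.eye(50))
```

Applying `F` to the eigenvector whose eigenvalue lies closest to `sigma` scales it by roughly `1/1e-3 = 1000`, while eigenvectors far from `sigma` are scaled by an order of magnitude less, which is exactly the gap-widening effect the abstract attributes to the filter transform.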
Problem

Research questions and friction points this paper is trying to address.

Solving operator eigenvalue problems with deep learning
Overcoming the curse of dimensionality in eigenvalue computation
Enhancing precision through spectral transformation techniques
Innovation

Methods, ideas, or system contributions that make the work stand out.

Spectral transformation network for operator eigenvalue problems
Deflation projection reduces search space for eigenfunctions
Filter transform magnifies eigenvalues in desired regions
👥 Authors
Hong Wang — University of Science and Technology of China
Jiang Yixuan — Tsinghua University
Jie Wang — University of Science and Technology of China
Xinyi Li — University of Science and Technology of China
Jian Luo — University of California San Diego
Huanshuo Dong — University of Science and Technology of China