Semialgebraic Neural Networks: From roots to representations

📅 2025-01-02
📈 Citations: 0
Influential: 0
🤖 AI Summary
Efficiently and accurately modeling bounded semialgebraic functions, especially those with discontinuities, remains challenging in scientific computing. Method: This paper introduces Semialgebraic Neural Networks (SANNs), which encode the graph of the learned function as the kernel of a piecewise polynomial and evaluate it by executing a homotopy continuation method embedded directly in the neural architecture. Contribution/Results: SANNs can exactly represent any bounded semialgebraic function, including discontinuous ones, and compute it up to the accuracy of a numerical ODE solver chosen by the programmer. The architecture supports end-to-end gradient-based training, and the paper demonstrates example applications on scientific computing tasks such as PDE inverse problems. By unifying semialgebraic structure with numerical learning, SANNs offer a hybrid symbolic-numerical modeling paradigm for neural networks.

📝 Abstract
Many numerical algorithms in scientific computing -- particularly in areas like numerical linear algebra, PDE simulation, and inverse problems -- produce outputs that can be represented by semialgebraic functions; that is, the graph of the computed function can be described by finitely many polynomial equalities and inequalities. In this work, we introduce Semialgebraic Neural Networks (SANNs), a neural network architecture capable of representing any bounded semialgebraic function, and computing such functions up to the accuracy of a numerical ODE solver chosen by the programmer. Conceptually, we encode the graph of the learned function as the kernel of a piecewise polynomial selected from a class of functions whose roots can be evaluated using a particular homotopy continuation method. We show by construction that the SANN architecture is able to execute this continuation method, thus evaluating the learned semialgebraic function. Furthermore, the architecture can exactly represent even discontinuous semialgebraic functions by executing a continuation method on each connected component of the target function. Lastly, we provide example applications of these networks and show they can be trained with traditional deep-learning techniques.
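The evaluation mechanism described in the abstract, tracking a root of a deforming polynomial system with a numerical ODE solver, can be illustrated in one dimension. The sketch below is a toy example of classical homotopy continuation, not the paper's architecture; all function names are assumptions for illustration. It tracks the root path of the linear homotopy H(x, t) = (1 - t)G(x) + tF(x) by integrating the Davidenko ODE dx/dt = -(∂H/∂x)⁻¹ ∂H/∂t from a known root of the easy start system G to a root of the target system F.

```python
# Illustrative sketch of homotopy continuation via an ODE solver.
# Not the SANN implementation; names and setup are hypothetical.
from scipy.integrate import solve_ivp

def continuation_root(F, G, dFdx, dGdx, x0):
    """Track the root of H(x, t) = (1 - t) G(x) + t F(x) from t=0 to t=1.

    x0 must be a known root of the start system G. The root path x(t)
    satisfies the Davidenko ODE  dx/dt = -(dH/dx)^{-1} dH/dt.
    """
    def rhs(t, x):
        dH_dx = (1 - t) * dGdx(x) + t * dFdx(x)   # partial of H w.r.t. x
        dH_dt = F(x) - G(x)                        # partial of H w.r.t. t
        return -dH_dt / dH_dx

    sol = solve_ivp(rhs, (0.0, 1.0), [x0], rtol=1e-10, atol=1e-12)
    return sol.y[0, -1]

# Target: the positive root of F(x) = x^2 - 2 (i.e. sqrt(2)),
# starting from the known root x = 1 of G(x) = x^2 - 1.
root = continuation_root(
    F=lambda x: x**2 - 2, G=lambda x: x**2 - 1,
    dFdx=lambda x: 2 * x, dGdx=lambda x: 2 * x, x0=1.0,
)
```

The accuracy of the recovered root is controlled by the ODE solver's tolerances, which mirrors the paper's claim that SANNs compute semialgebraic functions "up to the accuracy of a numerical ODE solver chosen by the programmer."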
Problem

Research questions and friction points this paper is trying to address.

Neural Network Model
Semialgebraic Functions
Scientific Computing
Innovation

Methods, ideas, or system contributions that make the work stand out.

SANNs
Semialgebraic function approximation
Deep learning training
S. David Mis
Department of Computational Applied Mathematics and Operations Research, Rice University, Houston, TX 77005, USA
Matti Lassas
Professor of applied mathematics, University of Helsinki, Finland
Maarten V. de Hoop
Simons Chair in Computational Applied Mathematics and Earth Science, Rice University, Houston, TX 77005, USA