Understanding Generalization in Physics Informed Models through Affine Variety Dimensions

๐Ÿ“… 2025-01-31
๐Ÿ“ˆ Citations: 0
โœจ Influential: 0
๐Ÿ“„ PDF
๐Ÿค– AI Summary
This study investigates the generalization mechanisms of physics-informed machine learning models. Addressing the theoretical limitation that existing frameworks fail to characterize how physical constraints fundamentally enhance generalization, the authors propose a novel analytical framework grounded in algebraic geometry. Specifically, they establish a quantitative relationship between generalization error and the dimension of the affine variety defined by the physical constraints. They prove that, for linear regressors embedded with differential equation structures, generalization performance is governed by this geometric dimension rather than by the classical parameter count. Methodologically, the approach integrates algebraic geometry (affine variety dimension analysis), physics-based modeling, and numerical approximation, and introduces a computationally tractable algorithm for estimating the variety dimension. Experiments demonstrate that this dimension accurately predicts generalization behavior across diverse physical constraints, providing a unified, interpretable, and computationally feasible theoretical foundation for both linear and nonlinear physics-informed learning.

๐Ÿ“ Abstract
In recent years, physics-informed machine learning has gained significant attention for its ability to enhance statistical performance and sample efficiency by integrating physical structures into machine learning models. These structures, such as differential equations, conservation laws, and symmetries, serve as inductive biases that can improve the generalization capacity of the hybrid model. However, the mechanisms by which these physical structures enhance generalization capacity are not fully understood, limiting the ability to guarantee the performance of the models. In this study, we show that the generalization performance of linear regressors incorporating differential equation structures is determined by the dimension of the associated affine variety, rather than the number of parameters. This finding enables a unified analysis of various equations, including nonlinear ones. We introduce a method to approximate the dimension of the affine variety and provide experimental evidence to validate our theoretical insights.
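The paper's actual dimension-approximation method is not reproduced here. As an illustrative sketch only, the following uses the standard algebraic-geometry fact that near a smooth point, the dimension of the variety cut out by polynomial constraints equals the ambient dimension minus the rank of the constraints' Jacobian at that point. The function name and the finite-difference approach are assumptions for this example, not the authors' algorithm.

```python
import numpy as np

def estimate_variety_dimension(constraints, point, n_vars, eps=1e-6, tol=1e-8):
    """Estimate the local dimension of {x : f_i(x) = 0} near a smooth point
    as n_vars minus the numerical rank of the Jacobian of the constraints."""
    # Build the Jacobian row by row with central finite differences.
    J = np.zeros((len(constraints), n_vars))
    for i, f in enumerate(constraints):
        for j in range(n_vars):
            e = np.zeros(n_vars)
            e[j] = eps
            J[i, j] = (f(point + e) - f(point - e)) / (2 * eps)
    # Dimension of the tangent space = ambient dim - rank of the Jacobian.
    return n_vars - np.linalg.matrix_rank(J, tol=tol)

# Example: the unit circle x^2 + y^2 = 1 is a 1-dimensional variety in R^2.
circle = [lambda x: x[0] ** 2 + x[1] ** 2 - 1.0]
p = np.array([1.0, 0.0])  # a smooth point on the circle
print(estimate_variety_dimension(circle, p, 2))  # → 1
```

This Jacobian-rank estimate is only valid at smooth points; at singular points of the variety, or when the numerical rank threshold `tol` is poorly chosen, it can over- or under-estimate the true dimension.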
Problem

Research questions and friction points this paper is trying to address.

Physical Information
Machine Learning Models
Performance Enhancement
Innovation

Methods, ideas, or system contributions that make the work stand out.

Affine Variety Dimension
Generalization Ability
Nonlinear Equation
๐Ÿ”Ž Similar Papers
No similar papers found.
T
Takeshi Koshizuka
Department of Computer Science, The University of Tokyo
Issei Sato
The University of Tokyo
Machine learning