Learning the Stellar Structure Equations via Self-supervised Physics-Informed Neural Networks

📅 2026-04-06
📈 Citations: 0
Influential: 0
🤖 AI Summary
This study addresses the high computational cost and poor scalability of traditional stellar structure solvers like MESA in large-scale population synthesis. The authors propose a self-supervised physics-informed neural network (PINN) that directly solves the stellar structure equations in a mesh-free, fully differentiable manner, producing continuous radial profiles of physical quantities from only boundary conditions and chemical composition as inputs. The approach is novel in being the first fully self-supervised framework requiring no training data, and it incorporates differentiable neural surrogates for the equation of state and opacity tables, enabling end-to-end optimization. Validated across multiple stellar masses, the model achieves an average relative absolute error of 3.06% and an R² of 99.98%, demonstrating excellent agreement with MESA results.
📝 Abstract
Stellar astrophysics relies critically on accurate descriptions of the physical conditions inside stars. Traditional solvers such as \texttt{MESA} (Modules for Experiments in Stellar Astrophysics), which employ adaptive finite-difference methods, can become computationally expensive and challenging to scale for large stellar population synthesis ($>10^9$ stars). In this work, we present a self-supervised physics-informed neural network (PINN) framework that provides a mesh-free and fully differentiable approach to solving the stellar structure equations under hydrostatic and thermal equilibrium. The model takes as input the stellar boundary conditions (at the center and surface) together with the chemical composition, and learns continuous radial profiles for mass $M_r(r)$, pressure $P(r)$, density $\rho(r)$, temperature $T(r)$, and luminosity $L_r(r)$ by enforcing the governing structure equations through physics-based loss terms. To incorporate realistic microphysics, we introduce auxiliary neural networks that approximate the equation of state and opacity tables as smooth, differentiable functions of the local thermodynamic state. These surrogates replace traditional tabulated inputs and enable end-to-end training. Once trained for a given star, the model produces continuous solutions across the entire radial domain without requiring discretization or interpolation. Validation against benchmark \texttt{MESA} models across a range of stellar masses yields a Mean Relative Absolute Error of $3.06\%$ and an average $R^2$ score of $99.98\%$. To our knowledge, this is the first demonstration that the stellar structure equations can be solved in a fully self-supervised and data-free fashion employing PINNs. This work establishes a foundation for scalable, physics-informed emulation of stellar interiors and opens the door to future extensions toward time-dependent stellar evolution.
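To make the "physics-based loss" idea concrete, here is a minimal sketch of how a residual of the hydrostatic-equilibrium equation, $dP/dr = -G M_r \rho / r^2$, could enter such a loss. This is not the authors' code: it uses NumPy finite differences on fixed profiles where the paper's PINN would use automatic differentiation of network outputs, and the constant-density test star is a toy assumption chosen because it solves the equation in closed form.

```python
import numpy as np

G = 6.674e-8  # Newton's gravitational constant in cgs units (cm^3 g^-1 s^-2)

def hydrostatic_residual(r, P, M_r, rho):
    """Pointwise residual of hydrostatic equilibrium, dP/dr + G*M_r*rho/r^2.

    In a PINN, dP/dr would come from autodiff of the network's P(r) output,
    and the mean squared residual would be one term of the physics loss.
    Here we approximate the derivative with finite differences instead.
    """
    dP_dr = np.gradient(P, r)
    return dP_dr + G * M_r * rho / r**2

# Toy check: a constant-density sphere satisfies the equation exactly, with
# M_r = (4/3)*pi*rho0*r^3 and P(r) = P_c - (2/3)*pi*G*rho0^2*r^2.
rho0 = 1.0
r = np.linspace(1e-3, 1.0, 2000)          # avoid r = 0 singularity
rho = np.full_like(r, rho0)
M_r = (4.0 / 3.0) * np.pi * rho0 * r**3
P = 1.0 - (2.0 / 3.0) * np.pi * G * rho0**2 * r**2

# Physics "loss" for this exact solution: mean squared residual, near zero.
loss = np.mean(hydrostatic_residual(r, P, M_r, rho) ** 2)
```

The same pattern would repeat for the other structure equations (mass continuity, energy generation, energy transport), with one residual term per equation summed into the total loss that training minimizes.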
Problem

Research questions and friction points this paper is trying to address.

stellar structure equations
computational scalability
stellar population synthesis
physics-informed neural networks
self-supervised learning
Innovation

Methods, ideas, or system contributions that make the work stand out.

Physics-Informed Neural Networks
Stellar Structure Equations
Self-supervised Learning
Differentiable Microphysics
Mesh-free Solvers
Manuel Ballester
SkAI Institute (NSF–Simons AI Institute for the Sky), Chicago, IL, USA
Santiago Lopez-Tapia
Department of Electrical and Computer Engineering, Northwestern University, Chicago, IL, USA
Seth Gossage
SkAI Institute (NSF–Simons AI Institute for the Sky), Chicago, IL, USA; CIERA, Northwestern University, Chicago, IL, USA; Department of Physics and Astronomy, Northwestern University, Chicago, IL, USA
Patrick Koller
SkAI Institute (NSF–Simons AI Institute for the Sky), Chicago, IL, USA; Department of Electrical and Computer Engineering, Northwestern University, Chicago, IL, USA
Philipp M. Srivastava
Department of Electrical and Computer Engineering, Northwestern University, Chicago, IL, USA
Ugur Demir
Northwestern University
Yongseok Jo
SkAI Institute (NSF–Simons AI Institute for the Sky), Chicago, IL, USA
Almudena P. Marquez
Department of Mathematics, University of Cadiz, Cadiz, Spain
Christoph Wuersch
OST Eastern Switzerland University of Applied Sciences, Switzerland
Souvik Chakraborty
Assistant Professor, IIT Delhi
Vicky Kalogera
SkAI Institute (NSF–Simons AI Institute for the Sky), Chicago, IL, USA; CIERA, Northwestern University, Chicago, IL, USA; Department of Physics and Astronomy, Northwestern University, Chicago, IL, USA
Aggelos Katsaggelos
SkAI Institute (NSF–Simons AI Institute for the Sky), Chicago, IL, USA; Department of Electrical and Computer Engineering, Northwestern University, Chicago, IL, USA