Polyconvex Physics-Augmented Neural Network Constitutive Models in Principal Stretches

📅 2025-03-01
📈 Citations: 0
Influential: 0
🤖 AI Summary
Addressing the challenge of simultaneously ensuring physical consistency and predictive accuracy in constitutive modeling of soft materials, this paper proposes a physics-guided deep learning framework based on principal stretches. Methodologically, it introduces the eigenvalues of the right stretch tensor $\mathbf{U}$ (i.e., the principal stretches) as direct inputs to an input-convex neural network (ICNN), and constructs a hierarchical convex potential from $\mathbf{U}$, $\operatorname{cof}\mathbf{U}$, and the Jacobian determinant $J$, thereby rigorously enforcing polyconvexity, frame indifference, and thermodynamic consistency. Crucially, the physical constraints are embedded intrinsically in the network architecture, eliminating the need for post-hoc regularization. Validation on both synthetic and experimental data demonstrates that the model achieves high predictive accuracy across diverse hyperelastic materials, significantly outperforming conventional invariant-based models while inherently guaranteeing physically admissible constitutive responses.

📝 Abstract
Accurate constitutive models of soft materials are crucial for understanding their mechanical behavior and ensuring reliable predictions in the design process. To this end, scientific machine learning research has produced flexible and general material model architectures that can capture the behavior of a wide range of materials, reducing the need for expert-constructed closed-form models. The focus has gradually shifted towards embedding physical constraints in the network architecture to regularize these over-parameterized models. Two popular approaches are input convex neural networks (ICNN) and neural ordinary differential equations (NODE). A related alternative has been the generalization of closed-form models, such as sparse regression from a large library. Remarkably, all prior work using ICNN or NODE uses the invariants of the Cauchy-Green tensor and none uses the principal stretches. In this work, we construct general polyconvex functions of the principal stretches in a physics-aware deep-learning framework and offer insights and comparisons to invariant-based formulations. The framework is based on recent developments to characterize polyconvex functions in terms of convex functions of the right stretch tensor $\mathbf{U}$, its cofactor $\text{cof}\,\mathbf{U}$, and its determinant $J$. Any convex function of a symmetric second-order tensor can be described with a convex and symmetric function of its eigenvalues. Thus, we first describe convex functions of $\mathbf{U}$ and $\text{cof}\,\mathbf{U}$ in terms of their respective eigenvalues using deep Hölder sets composed with ICNN functions. A third ICNN takes as input $J$ and the two convex functions of $\mathbf{U}$ and $\text{cof}\,\mathbf{U}$, and returns the strain energy as output. The ability of the model to capture arbitrary materials is demonstrated using synthetic and experimental data.
Problem

Research questions and friction points this paper is trying to address.

Develops polyconvex neural network models for soft materials.
Uses principal stretches instead of Cauchy-Green tensor invariants.
Demonstrates model accuracy with synthetic and experimental data.
Innovation

Methods, ideas, or system contributions that make the work stand out.

Polyconvex functions using principal stretches
Physics-aware deep-learning framework
ICNN for strain energy computation
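The layered construction described in the abstract can be sketched in code. The following is a minimal illustration under simplifying assumptions, not the authors' implementation: one-hidden-layer ICNNs stand in for the paper's deeper networks, and symmetry in the eigenvalues is obtained by averaging over permutations rather than the deep Hölder sets the paper uses. All class and function names here are hypothetical.

```python
import itertools
import numpy as np

def softplus(x):
    # convex, non-decreasing activation
    return np.logaddexp(0.0, x)

class ICNN:
    """Minimal one-hidden-layer input-convex network.

    The hidden-to-output weights are constrained non-negative and the
    activation is convex and non-decreasing, so the output is convex
    in the input (the standard ICNN construction).
    """
    def __init__(self, dim_in, hidden, rng):
        self.W0 = rng.normal(size=(hidden, dim_in))  # input weights: unconstrained
        self.b0 = rng.normal(size=hidden)
        self.W1 = np.abs(rng.normal(size=hidden))    # output weights: >= 0
        self.b1 = rng.normal()

    def __call__(self, x, nondecreasing=False):
        # With nondecreasing=True the input weights are also forced >= 0,
        # making the network non-decreasing in each argument (needed for
        # the outer network to preserve convexity of its convex inputs).
        W0 = np.abs(self.W0) if nondecreasing else self.W0
        return self.W1 @ softplus(W0 @ x + self.b0) + self.b1

def sym_convex(net, lams):
    """Symmetric convex function of the eigenvalues: averaging a convex
    function over all argument permutations stays convex and is symmetric.
    (A simpler substitute for the paper's deep Hölder sets.)"""
    return np.mean([net(np.array(p)) for p in itertools.permutations(lams)])

def strain_energy(F, net_U, net_cof, net_out):
    C = F.T @ F
    lam = np.sqrt(np.linalg.eigvalsh(C))  # principal stretches = eigenvalues of U
    # eigenvalues of cof U are the pairwise products of the stretches
    lam_cof = np.array([lam[1] * lam[2], lam[0] * lam[2], lam[0] * lam[1]])
    J = np.prod(lam)                      # det F = det U
    f_U = sym_convex(net_U, lam)
    f_cof = sym_convex(net_cof, lam_cof)
    # Outer ICNN must be non-decreasing in f_U and f_cof to preserve
    # polyconvexity; restricting it in J as well is more conservative
    # than necessary but keeps this sketch short.
    return net_out(np.array([f_U, f_cof, J]), nondecreasing=True)

# usage with random (untrained) weights
rng = np.random.default_rng(0)
nets = [ICNN(3, 8, rng) for _ in range(3)]
W = strain_energy(np.eye(3), *nets)
```

In a real model the weights would be trained on stress-strain data, with the non-negativity constraints enforced during optimization (e.g., by parameterizing the constrained weights through a softplus).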
Adrian Buganza Tepole
Columbia University, New York City, NY, USA
Asghar Jadoon
PhD Student, UT Austin
Computational solid mechanics · Physics-informed machine learning · Plasticity
Manuel Rausch
UT Austin
Biomechanics · Computational Modeling · Soft Tissue · Imaging
Jan N. Fuhg
The University of Texas at Austin, Austin TX, USA