PCG-Informed Neural Solvers for High-Resolution Homogenization of Periodic Microstructures

📅 2025-06-20
📈 Citations: 0
Influential: 0
🤖 AI Summary
To address the high computational cost of traditional numerical solvers and the limited generalizability and physical consistency of existing learning-based methods in high-resolution homogenization of periodic microstructures, this paper proposes the PCG-informed Neural Solver (CGINS). Its key contributions are: (1) a novel network initialization scheme embedding preconditioned conjugate gradient (PCG) iterations, coupled with a label-free, minimum-potential-energy self-supervised loss; (2) a sparse periodic 3D convolutional operator that rigorously enforces structural periodicity; and (3) a multi-scale architecture incorporating global displacement constraints to eliminate reliance on Dirichlet boundary conditions. Evaluated at 512³ resolution, CGINS achieves <1% relative error and 2–10× speedups over GPU-accelerated numerical solvers, while attaining state-of-the-art accuracy and cross-configuration generalization.
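The summary says the network supplies "PCG-friendly initial solutions" that the solver then refines. The paper's own solver is not reproduced on this page; below is a minimal textbook sketch of the PCG iteration being warm-started, using a Jacobi (diagonal) preconditioner. The function name `pcg` and the preconditioner choice are illustrative assumptions, not CGINS's actual implementation.

```python
import numpy as np

def pcg(A, b, x0, M_inv_diag, tol=1e-8, max_iter=100):
    """Preconditioned conjugate gradient for SPD A, warm-started at x0.
    M_inv_diag holds the inverse diagonal of A (Jacobi preconditioner);
    a better x0 (e.g. a network prediction) means fewer iterations."""
    x = x0.copy()
    r = b - A @ x                 # initial residual
    z = M_inv_diag * r            # preconditioned residual
    p = z.copy()                  # first search direction
    rz = r @ z
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rz / (p @ Ap)     # step length along p
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol:
            break
        z = M_inv_diag * r
        rz_new = r @ z
        p = z + (rz_new / rz) * p  # conjugate update of search direction
        rz = rz_new
    return x
```

In CGINS the role of `x0` is played by the network's predicted displacement field, which is what makes the hybrid scheme faster than running PCG from scratch.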

📝 Abstract
The mechanical properties of periodic microstructures are pivotal in various engineering applications. Homogenization theory is a powerful tool for predicting these properties by averaging the behavior of complex microstructures over a representative volume element. However, traditional numerical solvers for homogenization problems can be computationally expensive, especially at high resolutions and for complicated topologies and geometries. Existing learning-based methods, while promising, often struggle with accuracy and generalization in such scenarios. To address these challenges, we present CGINS, a preconditioned-conjugate-gradient-solver-informed neural network for solving homogenization problems. CGINS leverages sparse and periodic 3D convolution to enable high-resolution learning while ensuring structural periodicity. It features a multi-level network architecture that facilitates effective learning across different scales and employs minimum potential energy as a label-free loss function for self-supervised learning. The integrated preconditioned conjugate gradient iterations ensure that the network provides PCG-friendly initial solutions for fast convergence and high accuracy. Additionally, CGINS imposes a global displacement constraint to ensure physical consistency, addressing a key limitation of prior methods that rely on Dirichlet anchors. Evaluated on large-scale datasets with diverse topologies and material configurations, CGINS achieves state-of-the-art accuracy (relative error below 1%) and outperforms both learning-based baselines and GPU-accelerated numerical solvers. Notably, it delivers 2× to 10× speedups over traditional methods while maintaining physically reliable predictions at resolutions up to $512^3$.
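The abstract's "periodic 3D convolution" means the convolution sees the unit cell as wrapping around: a stencil at one face reads voxels from the opposite face. As a rough sketch of that property (not the paper's sparse operator), a naive dense version can be written with NumPy's wrap-mode padding; `periodic_conv3d` and the cubic-kernel assumption are illustrative.

```python
import numpy as np

def periodic_conv3d(vol, kernel):
    """Naive 3D sliding-window correlation with periodic (wrap-around)
    padding, so boundary outputs see the opposite face of the unit cell.
    Assumes a cubic, odd-sized kernel; dense and slow, for illustration only."""
    k = kernel.shape[0]
    pad = k // 2
    padded = np.pad(vol, pad, mode="wrap")   # periodic boundary condition
    out = np.empty(vol.shape, dtype=float)
    for i in range(vol.shape[0]):
        for j in range(vol.shape[1]):
            for l in range(vol.shape[2]):
                out[i, j, l] = np.sum(padded[i:i+k, j:j+k, l:l+k] * kernel)
    return out
```

The defining consequence of periodicity is shift-equivariance under circular shifts: convolving a cyclically shifted cell gives the cyclically shifted result, which is exactly the structural periodicity the abstract says the operator enforces.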
Problem

Research questions and friction points this paper is trying to address.

Solving high-resolution homogenization of periodic microstructures efficiently
Overcoming accuracy and generalization issues in learning-based methods
Reducing computational costs of traditional numerical solvers for homogenization
Innovation

Methods, ideas, or system contributions that make the work stand out.

Preconditioned-conjugate-gradient-informed neural network for homogenization
Sparse periodic 3D convolution enabling high-resolution learning
Multi-level architecture with energy-based self-supervised learning
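The "energy-based self-supervised learning" bullet refers to using the minimum potential energy principle as the loss: for a discretized linear system $Ku = f$ with SPD stiffness $K$, the energy $\Pi(u) = \tfrac{1}{2}u^\top K u - f^\top u$ is uniquely minimized at the true solution, so it can be minimized without any labeled reference solutions. A minimal sketch of this idea (with plain gradient descent standing in for network training; the function names are illustrative):

```python
import numpy as np

def potential_energy(K, f, u):
    """Discrete potential energy Pi(u) = 0.5 u^T K u - f^T u.
    For SPD K, its unique minimizer solves K u = f, so Pi is a
    label-free training objective."""
    return 0.5 * u @ (K @ u) - f @ u

def minimize_energy(K, f, u0, lr=0.1, steps=500):
    """Gradient descent on Pi; the gradient K u - f is just the residual,
    so driving Pi down drives the residual to zero."""
    u = u0.copy()
    for _ in range(steps):
        u -= lr * (K @ u - f)
    return u
```

In CGINS the network parameters, rather than `u` directly, are what gets updated, but the objective being minimized is this same energy functional.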