🤖 AI Summary
This work addresses the high computational cost of multiscale topology optimization for hyperelastic materials, which arises from repeated solution of microscopic boundary value problems. The authors propose a concurrent multiscale optimization framework that employs physics-augmented neural networks (PANNs) as surrogate constitutive models to simultaneously optimize macroscopic material distribution and microscopic structural descriptors within a unified finite strain nonlinear setting. By embedding physical priors—such as convexity and material symmetry—into the network architecture and leveraging input-specific neural networks (ISNNs), invariant-based representations, structural tensors, and automatic differentiation, the approach ensures thermodynamic consistency and numerical stability. The method is successfully demonstrated on transversely isotropic, cubic anisotropic, and nearly incompressible materials, achieving substantial computational savings while accurately capturing multiscale coupling effects and enabling spatially tailored material performance.
📝 Abstract
Multiscale topology optimization (TO) of hyperelastic materials remains computationally prohibitive due to the repeated solution of microscale boundary value problems. In this work, we present a concurrent multiscale topology optimization framework that overcomes this limitation by leveraging physics-augmented neural networks (PANNs) as surrogate constitutive models. The proposed approach enables the simultaneous optimization of macroscale material distribution and microscale descriptors within a unified nonlinear finite strain setting. The surrogate models are constructed using input-specific neural networks (ISNNs) that enforce key physical principles directly within the architecture, including convexity and material symmetry through invariant-based representations and structural tensors. This ensures thermodynamic consistency and numerical stability while accurately representing homogenized anisotropic hyperelastic responses. The trained PANNs replace the microscale boundary value problem and provide efficient evaluations of stresses and consistent tangent moduli using analytical first and second derivatives of the neural network, enabling tractable large-scale multiscale optimization. The framework is demonstrated on representative microstructures exhibiting transversely isotropic, cubic anisotropic, and nearly incompressible isotropic behavior. The results show that the proposed method captures complex multiscale interactions and enables physically meaningful spatial tailoring of material properties, while significantly reducing computational cost compared to classical FE$^2$ approaches. These findings establish PANNs as a powerful tool for high-fidelity multiscale topology optimization of nonlinear anisotropic materials.
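The differentiation pipeline described above — a scalar strain-energy potential whose first derivative yields the first Piola–Kirchhoff stress and whose second derivative yields the consistent tangent moduli — can be sketched as follows. This is a minimal illustrative snippet, not the paper's code: it substitutes a simple compressible Neo-Hookean potential for the trained PANN and uses finite differences in place of the network's analytical derivatives; the function names and parameter values (`c1`, `kappa`) are assumptions.

```python
import numpy as np

def psi(F, c1=1.0, kappa=10.0):
    # Stand-in for the trained PANN potential: a compressible
    # Neo-Hookean energy psi = c1*(I1 - 3 - 2 ln J) + kappa/2*(J - 1)^2,
    # written in terms of the invariants I1 = tr(F^T F) and J = det F.
    J = np.linalg.det(F)
    I1 = np.trace(F.T @ F)
    return c1 * (I1 - 3.0 - 2.0 * np.log(J)) + 0.5 * kappa * (J - 1.0) ** 2

def first_piola(F, h=1e-6):
    # First Piola-Kirchhoff stress P_iJ = d psi / d F_iJ, here by central
    # finite differences (the paper uses exact derivatives of the network).
    P = np.zeros((3, 3))
    for i in range(3):
        for j in range(3):
            dF = np.zeros((3, 3)); dF[i, j] = h
            P[i, j] = (psi(F + dF) - psi(F - dF)) / (2.0 * h)
    return P

def tangent(F, h=1e-4):
    # Consistent tangent A_iJkL = d^2 psi / dF_iJ dF_kL, obtained by
    # differentiating the stress once more.
    A = np.zeros((3, 3, 3, 3))
    for k in range(3):
        for l in range(3):
            dF = np.zeros((3, 3)); dF[k, l] = h
            A[:, :, k, l] = (first_piola(F + dF) - first_piola(F - dF)) / (2.0 * h)
    return A
```

At the undeformed state `F = I` this potential is stress-free, and the resulting tangent has the major symmetry `A_iJkL = A_kLiJ`; a trained convex potential network would be queried in exactly this way at each macroscale quadrature point.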