🤖 AI Summary
Conventional numerical simulations (e.g., FEM/CFD) for structural dynamic parameter optimization—mass (m), stiffness (k), and damping (c)—suffer from prohibitive computational cost. Method: This paper proposes an intelligent optimization framework integrating a Graph Neural Network (GNN) as a high-fidelity surrogate model with a Genetic Algorithm (GA) for global search. The GNN learns the nonlinear mapping between structural parameters and dynamic responses, trained on high-quality data generated via the Newmark-β method. GA operates exclusively on the surrogate, eliminating costly iterative full-scale simulations. Contribution/Results: Experiments demonstrate that the framework reduces computational overhead by one to two orders of magnitude compared to full simulations, while maintaining robust convergence and high prediction accuracy (mean error < 3%). It significantly enhances both efficiency and generalizability in large-scale structural parameter optimization.
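The Newmark-β scheme mentioned above is the standard implicit integrator for the SDOF equation of motion, m·ü + c·u̇ + k·u = F(t). A minimal sketch of how training trajectories could be generated with it is shown below; the function name, default coefficients (β = 1/4, γ = 1/2, the average-acceleration variant), and step count are illustrative assumptions, not the paper's exact implementation.

```python
import math

def newmark_beta_sdof(m, c, k, force, dt, n_steps, u0=0.0, v0=0.0,
                      beta=0.25, gamma=0.5):
    """Newmark-beta integration of m*u'' + c*u' + k*u = F(t).

    Defaults (beta=1/4, gamma=1/2) give the unconditionally stable
    average-acceleration scheme. `force` maps time -> external load.
    Returns the displacement history [u_0, ..., u_n].
    """
    u, v = u0, v0
    a = (force(0.0) - c * v - k * u) / m          # initial acceleration
    # Effective stiffness is constant for a linear system, so factor once.
    k_eff = k + gamma * c / (beta * dt) + m / (beta * dt ** 2)
    history = [u]
    for n in range(1, n_steps + 1):
        t = n * dt
        # Effective load: external force plus inertia/damping history terms.
        rhs = (force(t)
               + m * (u / (beta * dt ** 2) + v / (beta * dt)
                      + (1.0 / (2.0 * beta) - 1.0) * a)
               + c * (gamma * u / (beta * dt)
                      + (gamma / beta - 1.0) * v
                      + dt * (gamma / (2.0 * beta) - 1.0) * a))
        u_new = rhs / k_eff
        a_new = ((u_new - u) / (beta * dt ** 2) - v / (beta * dt)
                 - (1.0 / (2.0 * beta) - 1.0) * a)
        v_new = v + dt * ((1.0 - gamma) * a + gamma * a_new)
        u, v, a = u_new, v_new, a_new
        history.append(u)
    return history
```

Sweeping (m, k, c) over a grid of configurations and recording each displacement history would yield the kind of parameter-to-response dataset the surrogate is trained on.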
📝 Abstract
The optimization of structural parameters, such as mass (m), stiffness (k), and damping coefficient (c), is critical for designing efficient, resilient, and stable structures. Conventional numerical approaches, including Finite Element Method (FEM) and Computational Fluid Dynamics (CFD) simulations, provide high-fidelity results but are computationally expensive for iterative optimization tasks, as each evaluation requires solving the governing equations for every parameter combination. This study proposes a hybrid data-driven framework that integrates a Graph Neural Network (GNN) surrogate model with a Genetic Algorithm (GA) optimizer to overcome these challenges. The GNN is trained to accurately learn the nonlinear mapping between structural parameters and dynamic displacement responses, enabling rapid predictions without repeatedly solving the system equations. A dataset of single-degree-of-freedom (SDOF) system responses is generated using the Newmark-β method across diverse mass, stiffness, and damping configurations. The GA then searches for globally optimal parameter sets by minimizing predicted displacements and enhancing dynamic stability. Results demonstrate that the GNN-GA framework achieves strong convergence, robust generalization, and significantly reduced computational cost compared to conventional simulations. This approach highlights the effectiveness of combining machine learning surrogates with evolutionary optimization for automated and intelligent structural design.
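The GA side of the framework evaluates the cheap surrogate instead of the simulator. A minimal real-coded GA sketch is given below; `surrogate` stands in for the trained GNN's predicted peak displacement, and the operators (tournament selection, blend crossover, Gaussian mutation, elitism) and hyperparameters are illustrative assumptions rather than the paper's exact configuration.

```python
import random

def genetic_search(surrogate, bounds, pop_size=40, generations=60,
                   crossover_rate=0.8, mutation_rate=0.1, seed=0):
    """Real-coded GA minimizing `surrogate(params)`.

    `bounds` is a list of (lo, hi) ranges, one per parameter
    (e.g., m, k, c). Returns the best parameter vector found.
    """
    rng = random.Random(seed)
    dim = len(bounds)

    def random_individual():
        return [rng.uniform(lo, hi) for lo, hi in bounds]

    def clip(x):  # keep mutated children inside the search box
        return [min(max(xi, lo), hi) for xi, (lo, hi) in zip(x, bounds)]

    pop = [random_individual() for _ in range(pop_size)]
    best = min(pop, key=surrogate)
    for _ in range(generations):
        new_pop = [best[:]]                        # elitism: carry best over
        while len(new_pop) < pop_size:
            # Tournament selection of two parents (size-3 tournaments).
            p1 = min(rng.sample(pop, 3), key=surrogate)
            p2 = min(rng.sample(pop, 3), key=surrogate)
            if rng.random() < crossover_rate:      # blend crossover
                alpha = rng.random()
                child = [alpha * a + (1 - alpha) * b
                         for a, b in zip(p1, p2)]
            else:
                child = p1[:]
            if rng.random() < mutation_rate:       # Gaussian mutation
                i = rng.randrange(dim)
                lo, hi = bounds[i]
                child[i] += rng.gauss(0.0, 0.1 * (hi - lo))
            new_pop.append(clip(child))
        pop = new_pop
        best = min(pop + [best], key=surrogate)
    return best
```

Because each fitness call is a single surrogate forward pass rather than a full time-history simulation, the whole search costs on the order of pop_size × generations inexpensive evaluations, which is where the claimed one-to-two-orders-of-magnitude speedup comes from.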