Meta-neural Topology Optimization: Knowledge Infusion with Meta-learning

📅 2025-02-03
📈 Citations: 0
Influential: 0
🤖 AI Summary
Topology optimization (TO) lacks cross-task knowledge transfer capability, necessitating de novo iterative design for each new problem and incurring high computational cost. This paper proposes Meta-Neural TO, the first framework to integrate meta-learning—specifically a variant of Model-Agnostic Meta-Learning (MAML)—into TO, coupled with SIREN-based implicit neural representations for mesh-independent initial design generation. Its core contributions are: (i) identifying strain energy distribution as a physics-informed prior to guide the construction of high-quality initial density fields; and (ii) enabling cross-resolution knowledge transfer, allowing high-resolution tasks to be initialized without retraining. Experiments demonstrate that Meta-Neural TO accelerates convergence in 74.1% of high-resolution tasks, reduces average iteration count by 33.6%, and achieves final design quality comparable to conventional TO methods.
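The SIREN-based implicit representation mentioned above parameterizes the density field as a coordinate network, which is what makes the initial designs mesh-independent: the same trained weights can be sampled on any grid. A minimal sketch of this idea in plain numpy follows; the layer sizes, `w0` frequency, and sigmoid output squashing are illustrative assumptions, not the paper's exact architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

def init_layer(fan_in, fan_out, w0=30.0, first=False):
    # SIREN initialization: uniform in [-1/fan_in, 1/fan_in] for the first
    # layer, [-sqrt(6/fan_in)/w0, sqrt(6/fan_in)/w0] for subsequent layers.
    bound = 1.0 / fan_in if first else np.sqrt(6.0 / fan_in) / w0
    W = rng.uniform(-bound, bound, size=(fan_in, fan_out))
    b = rng.uniform(-bound, bound, size=fan_out)
    return W, b

class DensityField:
    """Mesh-independent density field rho(x, y) in (0, 1)."""

    def __init__(self, hidden=64, depth=3, w0=30.0):
        self.w0 = w0
        dims = [2] + [hidden] * depth + [1]
        self.params = [init_layer(i, o, w0, first=(k == 0))
                       for k, (i, o) in enumerate(zip(dims[:-1], dims[1:]))]

    def __call__(self, coords):
        h = coords
        for W, b in self.params[:-1]:
            h = np.sin(self.w0 * (h @ W + b))  # sinusoidal SIREN activation
        W, b = self.params[-1]
        return 1.0 / (1.0 + np.exp(-(h @ W + b)))  # sigmoid -> density in (0, 1)

def grid(nx, ny):
    # Coordinates in [0, 1]^2, flattened to an (nx * ny, 2) array
    xs, ys = np.linspace(0, 1, nx), np.linspace(0, 1, ny)
    return np.stack(np.meshgrid(xs, ys), axis=-1).reshape(-1, 2)

# The same weights serve as an initialization at any resolution:
field = DensityField()
rho_coarse = field(grid(32, 16))    # 32 x 16 discretization
rho_fine = field(grid(256, 128))    # 256 x 128 discretization, no retraining
```

Because the density is a continuous function of coordinates rather than a per-element vector, an initialization meta-learned on the coarse grid transfers to the fine grid by simple re-evaluation, which is the mechanism behind the cross-resolution results reported above.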

📝 Abstract
Engineers learn from every design they create, building intuition that helps them quickly identify promising solutions for new problems. Topology optimization (TO) - a well-established computational method for designing structures with optimized performance - lacks this ability to learn from experience. Existing approaches treat design tasks in isolation, starting from a "blank canvas" design for each new problem, often requiring many computationally expensive steps to converge. We propose a meta-learning strategy, termed meta-neural TO, that finds effective initial designs through a systematic transfer of knowledge between related tasks, building on the mesh-agnostic representation provided by neural reparameterization. We compare our approach against established TO methods, demonstrating efficient optimization across diverse test cases without compromising design quality. Further, we demonstrate powerful cross-resolution transfer capabilities, where initializations learned on lower-resolution discretizations lead to superior convergence in 74.1% of tasks on a higher-resolution test set, reducing the average number of iterations by 33.6% compared to standard neural TO. Remarkably, we discover that meta-learning naturally gravitates toward the strain energy patterns found in uniform density designs as effective starting points, aligning with engineering intuition.
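The meta-learning loop described in the abstract can be illustrated with a toy first-order sketch (Reptile-style, a common simplification of MAML): adapt a copy of the shared initialization on each task for a few inner steps, then nudge the initialization toward the adapted weights. The quadratic stand-in loss and all hyperparameters below are illustrative assumptions; the actual paper optimizes a compliance objective through an FEM solve, not this toy target.

```python
import numpy as np

rng = np.random.default_rng(0)

def task_loss_grad(theta, target):
    # Stand-in gradient: quadratic pull toward a task-specific optimum.
    # (A real TO task would compute the gradient of structural compliance.)
    return theta - target

def meta_init(targets, meta_steps=200, inner_steps=5,
              inner_lr=0.1, meta_lr=0.5):
    """First-order meta-learning of a shared initialization (Reptile-style)."""
    theta = np.zeros_like(targets[0])
    for _ in range(meta_steps):
        deltas = []
        for target in targets:
            # Inner loop: adapt a copy of theta to this task
            phi = theta.copy()
            for _ in range(inner_steps):
                phi -= inner_lr * task_loss_grad(phi, target)
            deltas.append(phi - theta)
        # Outer update: move the initialization toward the adapted weights
        theta += meta_lr * np.mean(deltas, axis=0)
    return theta

# Eight hypothetical tasks, each with its own 4-parameter optimum
targets = [rng.normal(size=4) for _ in range(8)]
theta0 = meta_init(targets)
```

For this quadratic toy problem the learned initialization converges to the mean of the task optima, so each new task starts closer to its solution than a blank-canvas start; the paper's observation that meta-learning recovers the strain energy patterns of uniform-density designs is the TO analogue of this "central" initialization.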
Problem

Research questions and friction points this paper is trying to address.

Enhance topology optimization learning
Transfer knowledge across tasks
Improve convergence with meta-learning
Innovation

Methods, ideas, or system contributions that make the work stand out.

Meta-learning for topology optimization
Knowledge transfer between design tasks
Cross-resolution transfer capabilities