PGD-TO: A Scalable Alternative to MMA Using Projected Gradient Descent for Multi-Constraint Topology Optimization

📅 2025-11-17
📈 Citations: 0
Influential: 0
🤖 AI Summary
Projected gradient descent (PGD) in multi-constraint nonlinear topology optimization suffers from high computational cost and poor robustness due to complex active-set identification. Method: This paper proposes an active-set-free PGD framework. The projection step is reformulated as a regularized convex quadratic program, ensuring well-posedness even from infeasible initial points. For general coupled constraints, a semismooth Newton method is employed; for decoupled constraints, a bisection-based projection is introduced. Spectral step-size adaptation and nonlinear conjugate-gradient directions are integrated to accelerate convergence. Results: On four canonical benchmark problems, the proposed method achieves convergence behavior and final performance comparable to MMA and OC, while attaining a 10–43× per-iteration speedup for general constraints and up to 115–312× for decoupled constraints. This significantly enhances computational efficiency and scalability for large-scale topology optimization.
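
The bisection-based projection mentioned above can be sketched for the simplest decoupled case, a single volume constraint with box bounds. This is an illustrative reconstruction, not the authors' pyFANTOM implementation; the function name and parameters are our own. The key fact it exploits is that the residual of the volume constraint is monotone in the Lagrange multiplier, so bisection converges unconditionally with no active-set search.

```python
import numpy as np

def project_volume_bisect(x, vol_frac, lo=0.0, hi=1.0, tol=1e-10, max_iter=100):
    """Project x onto {lo <= x <= hi, mean(x) = vol_frac} by bisecting
    on the Lagrange multiplier lam of the volume constraint.

    g(lam) = mean(clip(x - lam, lo, hi)) - vol_frac is monotonically
    decreasing in lam, so a sign-changing bracket suffices.
    """
    lam_lo = x.min() - hi  # here clip saturates at hi, so g >= 0
    lam_hi = x.max() - lo  # here clip saturates at lo, so g <= 0
    lam = 0.5 * (lam_lo + lam_hi)
    for _ in range(max_iter):
        lam = 0.5 * (lam_lo + lam_hi)
        g = np.clip(x - lam, lo, hi).mean() - vol_frac
        if abs(g) < tol:
            break
        if g > 0:
            lam_lo = lam  # too much material: raise the multiplier
        else:
            lam_hi = lam
    return np.clip(x - lam, lo, hi)
```

Each bisection step costs one clipped reduction over the design vector, which is why this route scales so well when constraints are independent.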

📝 Abstract
Projected Gradient Descent (PGD) methods offer a simple and scalable approach to topology optimization (TO), yet they often struggle with nonlinear and multi-constraint problems due to the complexity of active-set detection. This paper introduces PGD-TO, a framework that reformulates the projection step into a regularized convex quadratic problem, eliminating the need for active-set search and ensuring well-posedness even when constraints are infeasible. The framework employs a semismooth Newton solver for general multi-constraint cases and a binary search projection for single or independent constraints, achieving fast and reliable convergence. It further integrates spectral step-size adaptation and nonlinear conjugate-gradient directions for improved stability and efficiency. We evaluate PGD-TO on four benchmark families representing the breadth of TO problems: (i) minimum compliance with a linear volume constraint, (ii) minimum volume under a nonlinear compliance constraint, (iii) multi-material minimum compliance with four independent volume constraints, and (iv) minimum compliance with coupled volume and center-of-mass constraints. Across these single- and multi-constraint, linear and nonlinear cases, PGD-TO achieves convergence and final compliance comparable to the Method of Moving Asymptotes (MMA) and Optimality Criteria (OC), while reducing per-iteration computation time by 10–43× on general problems and 115–312× when constraints are independent. Overall, PGD-TO establishes a fast, robust, and scalable alternative to MMA, advancing topology optimization toward practical large-scale, multi-constraint, and nonlinear design problems. Public code available at: https://github.com/ahnobari/pyFANTOM
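
The spectral step-size adaptation named in the abstract is commonly realized with Barzilai–Borwein steps; whether PGD-TO uses exactly this rule is an assumption here. A minimal sketch of a PGD loop with such a step, generic over the gradient and projection operators (all names are illustrative, not the paper's API):

```python
import numpy as np

def pgd_spectral(grad, project, x0, iters=200, step0=1.0):
    """Projected gradient descent with a Barzilai-Borwein (spectral) step.

    grad:    callable returning the objective gradient at x
    project: callable mapping a point onto the feasible set
    """
    x = project(x0)
    g = grad(x)
    step = step0
    for _ in range(iters):
        x_new = project(x - step * g)
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        sy = s @ y
        # BB1 step <s, s>/<s, y>; keep the old step if curvature is non-positive
        if sy > 1e-12:
            step = (s @ s) / sy
        x, g = x_new, g_new
    return x
```

For example, minimizing `||x - c||^2` over the unit box with `project = lambda x: np.clip(x, 0.0, 1.0)` recovers the clipped target `clip(c, 0, 1)` in a handful of iterations. The spectral step approximates curvature from successive iterates, avoiding a line search while keeping the per-iteration cost at one gradient and one projection.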
Problem

Research questions and friction points this paper is trying to address.

Addresses scalability issues in multi-constraint topology optimization
Eliminates complex active-set detection for nonlinear constraints
Provides fast convergence for large-scale design problems
Innovation

Methods, ideas, or system contributions that make the work stand out.

Reformulates projection as regularized convex quadratic problem
Uses semismooth Newton solver for multi-constraint cases
Integrates spectral step-size and conjugate-gradient directions
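
For intuition on the semismooth Newton ingredient, here is the iteration applied to the scalar multiplier equation of a single volume projection. The paper applies semismooth Newton to general coupled multi-constraint quadratic programs; this one-constraint reduction, and every name in it, is our simplification for illustration.

```python
import numpy as np

def project_volume_newton(x, vol_frac, tol=1e-12, max_iter=50):
    """Semismooth Newton on g(lam) = mean(clip(x - lam, 0, 1)) - vol_frac = 0.

    g is piecewise linear and nonsmooth at the clip breakpoints; its Clarke
    generalized derivative is -(fraction of components strictly inside (0, 1)).
    """
    n = x.size
    lam = x.mean() - vol_frac  # multiplier matching the volume if no bound were active
    for _ in range(max_iter):
        z = x - lam
        g = np.clip(z, 0.0, 1.0).mean() - vol_frac
        if abs(g) < tol:
            break
        free = np.logical_and(z > 0.0, z < 1.0).mean()
        dg = -max(free, 1.0 / n)  # guard against a vanishing generalized derivative
        lam -= g / dg
    return np.clip(x - lam, 0.0, 1.0)
```

Because g is piecewise linear, the Newton step solves the current linear piece exactly, so the iteration typically terminates in a handful of steps once the correct set of clipped components stabilizes; unlike classical active-set strategies, that set is never enumerated explicitly.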