Large Language Models for Design Structure Matrix Optimization

📅 2025-06-11
📈 Citations: 0
Influential: 0
🤖 AI Summary
Design Structure Matrix (DSM) sequencing is an NP-hard combinatorial optimization problem aimed at minimizing feedback loops to enhance modularity and process efficiency. Method: This paper introduces, for the first time, large language models (LLMs) into DSM optimization. We propose a structured prompting framework that integrates network topology encoding with domain-specific knowledge injection, enabling iterative reasoning within the LLM optimizer—thereby overcoming semantic modeling and contextual understanding limitations inherent in conventional mathematical heuristics. Results: Evaluated on diverse real-world and synthetic DSM instances, our approach significantly outperforms random and deterministic baselines: convergence speed improves by 37%–62%, average feedback loop count decreases by 28.5%, and performance gains are model-agnostic. These results validate the effectiveness and generalizability of LLM-driven, domain-aware optimization.

📝 Abstract
In complex engineering systems, the interdependencies among components or development activities are often modeled and analyzed using a Design Structure Matrix (DSM). Reorganizing elements within a DSM to minimize feedback loops and enhance modularity or process efficiency constitutes a challenging combinatorial optimization (CO) problem in engineering design and operations. As problem sizes increase and dependency networks become more intricate, traditional optimization methods that rely solely on mathematical heuristics often fail to capture contextual nuances and struggle to deliver effective solutions. In this study, we explore the potential of Large Language Models (LLMs) to help solve such CO problems by leveraging their capabilities for advanced reasoning and contextual understanding. We propose a novel LLM-based framework that integrates network topology with contextual domain knowledge for iterative optimization of DSM element sequencing, a common CO problem. Experiments on various DSM cases show that our method consistently achieves faster convergence and superior solution quality compared to both stochastic and deterministic baselines. Notably, we find that incorporating contextual domain knowledge significantly enhances optimization performance regardless of the chosen LLM backbone. These findings highlight the potential of LLMs to solve complex engineering CO problems by combining semantic and mathematical reasoning. This approach paves the way towards a new paradigm in LLM-based engineering design optimization.
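The objective the abstract describes can be made concrete with a small sketch: in a binary DSM where `dsm[i][j] == 1` means element `i` depends on element `j`, a feedback mark is any dependency on an element scheduled later, and sequencing seeks the ordering that minimizes those marks. This is a minimal illustration of the metric, not the paper's own code; the function name and toy matrix are ours.

```python
def count_feedback_marks(dsm, order):
    """Count feedback marks (above-diagonal dependencies) for an ordering.

    dsm[i][j] == 1 means element i depends on (receives input from) j.
    A feedback mark occurs whenever i depends on a j scheduled after it.
    """
    pos = {elem: k for k, elem in enumerate(order)}
    n = len(dsm)
    return sum(
        1
        for i in range(n)
        for j in range(n)
        if i != j and dsm[i][j] and pos[j] > pos[i]
    )

# Toy 3-element DSM: element 0 depends on 2, element 1 depends on 0.
dsm = [
    [0, 0, 1],
    [1, 0, 0],
    [0, 0, 0],
]
print(count_feedback_marks(dsm, [0, 1, 2]))  # → 1 (0 waits on later 2)
print(count_feedback_marks(dsm, [2, 0, 1]))  # → 0 (dependency-respecting order)
```

Exhaustively searching orderings is factorial in the number of elements, which is why DSM sequencing is treated as an NP-hard CO problem and why heuristic or LLM-guided search is used instead.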
Problem

Research questions and friction points this paper is trying to address.

Optimizing the Design Structure Matrix to minimize feedback loops
Solving combinatorial optimization in engineering with LLMs
Enhancing DSM modularity using contextual domain knowledge
Innovation

Methods, ideas, or system contributions that make the work stand out.

LLMs integrate network topology for DSM optimization
Contextual domain knowledge enhances optimization performance
Combines semantic and mathematical reasoning for CO
Shuo Jiang
Department of Systems Engineering, City University of Hong Kong, Hong Kong
Min Xie
Department of Systems Engineering, City University of Hong Kong, Hong Kong
Jianxi Luo
Professor, City University of Hong Kong
Innovation Principles · AI-Driven Innovation · GenAI in Engineering Design