COSMIC: Enabling Full-Stack Co-Design and Optimization of Distributed Machine Learning Systems

📅 2025-05-21
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
Large-scale distributed machine learning systems face severe co-design challenges due to the explosive growth of the cross-layer parameter space. This paper proposes COSMIC, a full-stack co-design framework that breaks away from conventional single-layer, isolated optimization: first, it introduces the Parameter Set Architecture (PSA) abstraction to unify the configuration spaces across the hardware, communication, software framework, and algorithmic layers; second, it constructs an agent-driven end-to-end simulation environment that discovers high-performance configurations without explicit manual tuning. Evaluated on four Transformer models, including one with up to 175 billion parameters, the framework identifies eight non-obvious high-performance system configurations, achieving 1.50×–48.41× end-to-end speedup over state-of-the-art single-layer optimizations. To the authors' knowledge, this is the first work to realize joint four-layer optimization with PSA-guided automated configuration search, establishing a new paradigm for large-model system design.
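To make the PSA idea concrete, here is a minimal sketch of what a unified cross-layer configuration space might look like. The layer names follow the four layers named in the summary, but every parameter name and value below is a hypothetical illustration, not taken from the paper:

```python
import itertools

# Hypothetical PSA-style unified configuration space spanning the four
# design layers. Parameter names and choices are illustrative only.
psa_space = {
    "hardware":      {"num_accelerators": [8, 16, 32]},
    "communication": {"collective": ["ring", "tree"]},
    "framework":     {"pipeline_stages": [2, 4]},
    "algorithm":     {"precision": ["fp16", "bf16"]},
}

def enumerate_configs(space):
    """Yield every full-stack configuration as one flat dict."""
    keys, values = [], []
    for layer, params in space.items():
        for name, choices in params.items():
            keys.append((layer, name))
            values.append(choices)
    for combo in itertools.product(*values):
        yield {f"{layer}.{name}": v for (layer, name), v in zip(keys, combo)}

configs = list(enumerate_configs(psa_space))
print(len(configs))  # 3 * 2 * 2 * 2 = 24 joint configurations
```

Even this toy space shows why cross-layer search explodes combinatorially: the joint space is the product of the per-layer spaces, which is exactly what a PSA-style abstraction lets a search agent treat uniformly.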

📝 Abstract
Large-scale machine learning models necessitate distributed systems, posing significant design challenges due to the large parameter space across distinct design stacks. Existing studies often focus on optimizing individual system aspects in isolation. This work challenges that limitation and introduces COSMIC, a full-stack distributed machine learning systems environment enabling end-to-end simulation and agent-based design space exploration. To facilitate efficient exploration and optimization across the entire stack, we introduce the Parameter Set Architecture (PSA), an abstraction analogous to the instruction set architecture that hides configuration complexity from agent-based search methods. Case studies demonstrate COSMIC's ability to consolidate parameters across multiple layers of design abstraction, discovering eight non-obvious high-performance system configurations across four transformer-based models with up to 175 billion parameters. By optimizing across the stack, COSMIC delivers 1.50–48.41× higher performance compared to isolated single-stack optimization.
Problem

Research questions and friction points this paper is trying to address.

Optimizing distributed ML systems across full-stack design layers
Enabling end-to-end co-design via simulation and agent-based exploration
Abstracting configuration complexities for multi-layer system optimization
Innovation

Methods, ideas, or system contributions that make the work stand out.

Full-stack co-design for distributed ML systems
Parameter Set Architecture simplifies configuration
Agent-based exploration optimizes multi-layer parameters
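The agent-based exploration bullet above can be sketched as a simple search loop over a joint configuration space. Everything below is an illustrative stand-in: the toy `simulate` cost model and the random-search agent are assumptions for demonstration, not the COSMIC simulator or its actual search algorithm:

```python
import random

random.seed(0)

# Hypothetical flat cross-layer space (names and values are made up).
space = {
    "num_accelerators": [8, 16, 32],
    "collective": ["ring", "tree"],
    "pipeline_stages": [2, 4],
    "precision": ["fp16", "bf16"],
}

def simulate(cfg):
    """Stub end-to-end latency model (lower is better); purely illustrative."""
    latency = 100.0 / cfg["num_accelerators"]
    latency *= 0.9 if cfg["collective"] == "tree" else 1.0
    latency *= 1.0 + 0.05 * cfg["pipeline_stages"]
    latency *= 0.8 if cfg["precision"] == "bf16" else 1.0
    return latency

def random_search(space, budget=50):
    """Simplest possible 'agent': sample configs, keep the best seen."""
    best_cfg, best_lat = None, float("inf")
    for _ in range(budget):
        cfg = {k: random.choice(v) for k, v in space.items()}
        lat = simulate(cfg)
        if lat < best_lat:
            best_cfg, best_lat = cfg, lat
    return best_cfg, best_lat

best_cfg, best_lat = random_search(space)
print(best_cfg, round(best_lat, 3))
```

A real agent would replace random sampling with a learned or guided policy, but the interface is the same: propose a full-stack configuration, query the simulator, and update the search state.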