🤖 AI Summary
This work addresses the problem that virtual robot designs often remain unrealizable because they neglect physical manufacturing constraints. To bridge this gap, we propose the first automated pipeline that translates abstract virtual designs into physically manufacturable blueprints, accepting sketch inputs from either human users or AI systems. Our approach integrates manufacturing constraint parsing, structure–function semantic embedding, and modular mapping to explicitly model critical hardware components (actuators, electronics, batteries, and wiring) within the design process. By embedding real-world fabrication requirements directly into the generative workflow, the method substantially lowers the barrier to turning conceptual robot designs into functional physical machines, improving both the feasibility and the development efficiency of novel robotic systems.
📝 Abstract
Over the past three decades, countless embodied yet virtual agents have freely evolved inside computer simulations, but vanishingly few were realized as physical robots. This is because evolution was conducted at a level of abstraction that was convenient for freeform body generation (creation, mutation, recombination) but swept away almost all of the physical details of functional body parts. The resulting designs were crude and underdetermined, requiring considerable effort and expertise to convert into a manufacturable format. Here, we automate this mapping from simplified design spaces that are readily evolvable to complete blueprints that can be directly followed by a builder. The pipeline incrementally resolves manufacturing constraints by embedding the structural and functional semantics of motors, electronics, batteries, and wiring into the abstract virtual design. In lieu of evolution, a user-defined or AI-generated "sketch" of a body plan can also be fed as input to the pipeline, providing a versatile framework for accelerating the design of novel robots.
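The abstract describes a staged pipeline: parse fabrication constraints, embed structural/functional semantics into the abstract design, and map abstract elements to concrete hardware modules. The minimal Python sketch below illustrates how such a staged sketch-to-blueprint flow could be organized. All names, data structures, and the part-selection logic here are hypothetical assumptions for illustration, not the authors' implementation.

```python
from dataclasses import dataclass, field

# Hypothetical data containers; field names are illustrative only.

@dataclass
class BodySketch:
    """Abstract body plan from evolution, a human user, or an AI generator."""
    segments: list
    joints: list  # each joint is a dict, e.g. {"name": "hip", "motorized": True}

@dataclass
class Blueprint:
    """Manufacturable output: resolved parts and (here, omitted) wiring."""
    parts: list = field(default_factory=list)
    wiring: list = field(default_factory=list)

def parse_constraints(catalog):
    """Stage 1 (assumed): index fabrication constraints, e.g. motor torque
    limits and battery dimensions, from a hypothetical part catalog."""
    return {part["name"]: part for part in catalog}

def embed_semantics(sketch, constraints):
    """Stage 2 (assumed): annotate each abstract joint with the functional
    role it must fulfil (actuation vs. passive structure)."""
    annotated = []
    for joint in sketch.joints:
        role = "actuated" if joint.get("motorized") else "passive"
        annotated.append({**joint, "role": role})
    return annotated

def map_modules(annotated, constraints):
    """Stage 3 (assumed): map each annotated element to a concrete hardware
    module satisfying the parsed constraints; a real system would search
    the catalog rather than use this placeholder pick."""
    parts = []
    for joint in annotated:
        if joint["role"] == "actuated":
            parts.append(constraints["servo_motor"])
    return parts

def sketch_to_blueprint(sketch, catalog):
    """End-to-end flow: sketch -> constraints -> semantics -> modules."""
    constraints = parse_constraints(catalog)
    annotated = embed_semantics(sketch, constraints)
    return Blueprint(parts=map_modules(annotated, constraints))

# Usage with a toy catalog and a one-joint body plan.
catalog = [{"name": "servo_motor", "torque_Nm": 1.5},
           {"name": "battery", "capacity_mAh": 2000}]
plan = BodySketch(segments=["torso", "leg"],
                  joints=[{"name": "hip", "motorized": True}])
print(sketch_to_blueprint(plan, catalog))
```

The key design point this sketch mirrors is that constraints are resolved incrementally, stage by stage, rather than checked only after a full design is generated.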