🤖 AI Summary
This work addresses the challenge of simultaneously aligning visual outputs and underlying behavioral rules with user intent in artificial life systems. Methodologically, it introduces a natural language–guided evolutionary framework built around a semantic feedback loop: a prompt-to-parameter encoder maps natural language prompts to evolvable parameters, which are optimized with the Covariance Matrix Adaptation Evolution Strategy (CMA-ES) and evaluated via CLIP-based semantic scoring within an interactive multi-agent simulation environment. Its key contribution is the first use of natural language as an evolutionary control medium, enabling prompt-driven automatic synthesis of behavioral rules and collaborative generative design. User studies show that the system significantly improves the semantic consistency of generated outputs compared to manual parameter tuning (p < 0.01), supporting its effectiveness and practical potential for human–AI co-evolutionary design.
📝 Abstract
We present a semantic feedback framework that enables natural language to guide the evolution of artificial life systems. Integrating a prompt-to-parameter encoder, a CMA-ES optimizer, and CLIP-based evaluation, the system allows user intent to modulate both visual outcomes and underlying behavioral rules. Implemented in an interactive ecosystem simulation, the framework supports prompt refinement, multi-agent interaction, and emergent rule synthesis. User studies show improved semantic alignment over manual tuning and demonstrate the system's potential as a platform for participatory generative design and open-ended evolution.
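The semantic feedback loop described above can be sketched as follows. This is a minimal illustrative stand-in, not the authors' implementation: the CLIP similarity score is stubbed with a simple distance-based fitness, and a basic (μ, λ) evolution strategy with decaying step size takes the place of full CMA-ES (which additionally adapts a covariance matrix and step size online). All function names and parameter choices here are hypothetical.

```python
import random

def stub_clip_score(params, target):
    # Hypothetical stand-in for CLIP similarity between a rendered
    # simulation frame and the user's prompt embedding. Here it is
    # simply the negative squared distance to an "ideal" parameter
    # vector, so higher is better, as with a similarity score.
    return -sum((p - t) ** 2 for p, t in zip(params, target))

def evolve(target, dim=4, pop=20, elite=5, gens=60, sigma=0.5, seed=0):
    """Simplified (mu, lambda) evolution strategy; a sketch of the
    role CMA-ES plays in the framework, not CMA-ES itself."""
    rng = random.Random(seed)
    mean = [0.0] * dim  # current estimate of the behavioral parameters
    for _ in range(gens):
        # Sample a population of candidate parameter vectors.
        offspring = [[m + rng.gauss(0.0, sigma) for m in mean]
                     for _ in range(pop)]
        # Rank candidates by the (stubbed) semantic score.
        offspring.sort(key=lambda c: stub_clip_score(c, target), reverse=True)
        # Recombine: new mean is the average of the elite candidates.
        best = offspring[:elite]
        mean = [sum(c[i] for c in best) / elite for i in range(dim)]
        sigma *= 0.97  # crude step-size decay (CMA-ES adapts this online)
    return mean

# Hypothetical "ideal" parameters a user's prompt implicitly targets.
best = evolve(target=[1.0, -0.5, 0.3, 2.0])
print([round(v, 2) for v in best])
```

In the full system, `stub_clip_score` would be replaced by rendering the ecosystem simulation under the candidate parameters and scoring the frame against the prompt with CLIP, closing the loop from language to evolved behavior.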