🤖 AI Summary
This work addresses the lack of a unified framework and the challenges in end-to-end prototyping within conversational search research by proposing and open-sourcing a full-stack platform. Built upon a modular architecture, the platform integrates core components such as query rewriting, ranking, and response generation, and introduces an innovative single-file node design that supports dual-mode execution, secure credential management, runtime telemetry, and AI-assisted coding. It provides over fifty ready-to-use modules, substantially lowering the barrier to development. Empirical case studies demonstrate the platform's advantages in system construction, reusability, and deployment efficiency.
📄 Abstract
Conversational search (CS) requires a complex software engineering pipeline that integrates query reformulation, ranking, and response generation. CS researchers currently face two barriers: the lack of a unified framework for efficiently sharing contributions with the community, and the difficulty of deploying the end-to-end prototypes needed for user evaluation. We introduce Orcheo, an open-source platform designed to address both barriers. Orcheo offers three key advantages: (i) a modular architecture promotes component reuse through single-file node modules, facilitating sharing and reproducibility in CS research; (ii) production-ready infrastructure bridges the prototype-to-system gap via dual execution modes, secure credential management, and execution telemetry, with built-in AI coding support that lowers the learning curve; (iii) starter-kit assets include 50+ off-the-shelf components for query understanding, ranking, and response generation, enabling the rapid bootstrapping of complete CS pipelines. We describe the framework architecture and validate Orcheo's utility through case studies that highlight modularity and ease of use. Orcheo is released as open source under the MIT License at https://github.com/ShaojieJiang/orcheo.
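To give a concrete feel for the single-file node and dual-mode execution ideas described above, here is a minimal illustrative sketch. All names (`QueryRewriteNode`, `run`, the state dictionary shape) are assumptions for illustration only, not Orcheo's actual API: one file defines a node's logic for pipeline use and also runs standalone as a script.

```python
"""Hypothetical single-file node sketch (not Orcheo's actual API)."""

from dataclasses import dataclass


@dataclass
class QueryRewriteNode:
    """Rewrites a conversational query into a standalone search query."""

    name: str = "query_rewrite"

    def run(self, state: dict) -> dict:
        # Pipeline mode: receive shared pipeline state, return updated state.
        history = state.get("history", [])
        query = state["query"]
        # Toy rewrite: prepend the previous user turn to add context.
        rewritten = f"{history[-1]} {query}" if history else query
        return {**state, "rewritten_query": rewritten}


if __name__ == "__main__":
    # Standalone mode: the same file executes as a script for quick testing,
    # which is one way a dual-mode, single-file design can work.
    node = QueryRewriteNode()
    out = node.run({"history": ["best pizza in Rome"], "query": "what about Milan?"})
    print(out["rewritten_query"])
```

Keeping metadata, logic, and a script entry point in one file is what makes such nodes easy to copy between projects, which is the reuse and reproducibility benefit the abstract emphasizes.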