Scene-Adaptive Motion Planning with Explicit Mixture of Experts and Interaction-Oriented Optimization

📅 2025-05-18
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
Autonomous trajectory planning in complex urban environments faces key challenges: difficulty in modeling multimodal behavior, poor generalization of single-expert models, and insufficient modeling of vehicle–environment interactions. To address these, this paper proposes an Explicit Mixture-of-Experts (EMoE) dynamic routing framework. Its core contributions are: (1) the first scene-aware explicit MoE architecture, employing a learnable router for task-adaptive expert selection; (2) a multimodal prior query mechanism to enhance diversity-aware trajectory modeling; and (3) an interaction-aware graph neural network coupled with a co-optimization loss function to explicitly capture bidirectional influences between the ego-vehicle and dynamic environmental agents. Evaluated on the NuPlan benchmark, our method achieves state-of-the-art performance across all major test scenarios, significantly improving planning success rate, ride comfort, and safety.

Technology Category

Application Category

📝 Abstract
Despite over a decade of development, autonomous driving trajectory planning in complex urban environments continues to encounter significant challenges. These challenges include the difficulty in accommodating the multi-modal nature of trajectories, the limitations of single expert in managing diverse scenarios, and insufficient consideration of environmental interactions. To address these issues, this paper introduces the EMoE-Planner, which incorporates three innovative approaches. Firstly, the Explicit MoE (Mixture of Experts) dynamically selects specialized experts based on scenario-specific information through a shared scene router. Secondly, the planner utilizes scene-specific queries to provide multi-modal priors, directing the model's focus towards relevant target areas. Lastly, it enhances the prediction model and loss calculation by considering the interactions between the ego vehicle and other agents, thereby significantly boosting planning performance. Comparative experiments were conducted using the Nuplan dataset against the state-of-the-art methods. The simulation results demonstrate that our model consistently outperforms SOTA models across nearly all test scenarios.
Problem

Research questions and friction points this paper is trying to address.

Accommodating multi-modal nature of autonomous driving trajectories
Overcoming single expert limitations in diverse scenarios
Enhancing environmental interaction considerations in planning
Innovation

Methods, ideas, or system contributions that make the work stand out.

Explicit MoE dynamically selects scenario-specific experts
Scene-specific queries provide multi-modal priors
Enhanced prediction model considers ego-agent interactions
🔎 Similar Papers
No similar papers found.
Hongbiao Zhu
Hongbiao Zhu
BYD Company Limited & Harbin Institute of Technology & Carnegie Mellon University
ExploratioNavigationAutonomous Driving
L
Liulong Ma
Automotive New Technology Research Institute, BYD Company Limited
X
Xian Wu
Automotive New Technology Research Institute, BYD Company Limited
X
Xin Deng
Automotive New Technology Research Institute, BYD Company Limited
Xiaoyao Liang
Xiaoyao Liang
Shanghai Jiao Tong University
Computer Architecture