Intention Chain-of-Thought Prompting with Dynamic Routing for Code Generation

📅 2025-12-15
📈 Citations: 0
Influential: 0
🤖 AI Summary
Existing chain-of-thought (CoT) prompting for code generation suffers from two key limitations: (1) reasoning redundancy (overly complex reasoning for simple tasks) and (2) inadequate intent modeling (neglecting algorithmic logic and computational-complexity objectives). To address these, we propose RoutingGen, the first dynamic routing framework to enable *on-demand reasoning*: based on task difficulty, it adaptively selects between few-shot prompting and our novel Intention Chain-of-Thought (ICoT). ICoT explicitly models high-level intentions, including core algorithms, time-complexity constraints, and problem semantics, and is paired with a difficulty-aware routing mechanism grounded in the cognitive economy principle. Evaluated across three large language models and six diverse benchmarks, RoutingGen achieves state-of-the-art performance while reducing average token consumption by 46.37%. Notably, ICoT significantly outperforms six mainstream prompting methods on high-difficulty tasks.

📝 Abstract
Large language models (LLMs) exhibit strong generative capabilities and have shown great potential in code generation. Existing chain-of-thought (CoT) prompting methods enhance model reasoning by eliciting intermediate steps, but suffer from two major limitations: First, their uniform application tends to induce overthinking on simple tasks. Second, they lack intention abstraction in code generation, such as explicitly modeling core algorithmic design and efficiency, leading models to focus on surface-level structures while neglecting the global problem objective. Inspired by the cognitive economy principle of engaging structured reasoning only when necessary to conserve cognitive resources, we propose RoutingGen, a novel difficulty-aware routing framework that dynamically adapts prompting strategies for code generation. For simple tasks, it adopts few-shot prompting; for more complex ones, it invokes a structured reasoning strategy, termed Intention Chain-of-Thought (ICoT), which we introduce to guide the model in capturing task intention, such as the core algorithmic logic and its time complexity. Experiments across three models and six standard code generation benchmarks show that RoutingGen achieves state-of-the-art performance in most settings, while reducing total token usage by 46.37% on average across settings. Furthermore, ICoT outperforms six existing prompting baselines on challenging benchmarks.
Problem

Research questions and friction points this paper is trying to address.

How can prompting strategies for code generation be adapted dynamically to task difficulty?
How can overthinking on simple tasks be avoided?
How can task intention, such as core algorithmic logic and efficiency, be modeled explicitly?
Innovation

Methods, ideas, or system contributions that make the work stand out.

Dynamic routing framework adapts prompting strategies based on task difficulty
Intention Chain-of-Thought guides model to capture core algorithmic logic
Reduces token usage while achieving state-of-the-art performance
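The routing idea described above can be sketched minimally: score a task's difficulty, then choose between a cheap few-shot prompt and a structured intention-oriented (ICoT-style) prompt. This is an illustrative sketch only; the scoring heuristic, prompt templates, threshold, and all function names (`score_difficulty`, `route_prompt`) are hypothetical and not taken from the paper, which does not disclose its routing implementation here.

```python
# Hypothetical sketch of difficulty-aware prompt routing in the spirit of
# RoutingGen: easy tasks get few-shot prompting, hard tasks get an
# intention-first (ICoT-style) prompt. All details are illustrative.

FEW_SHOT_PROMPT = (
    "Here are solved examples:\n{examples}\n\nNow solve:\n{task}"
)

ICOT_PROMPT = (
    "Before writing code, state the task's intention:\n"
    "1. Core algorithm to use\n"
    "2. Target time complexity\n"
    "3. Global problem objective\n"
    "Then implement it.\n\nTask:\n{task}"
)

def score_difficulty(task: str) -> float:
    """Toy proxy: longer, constraint-heavy descriptions count as harder."""
    keywords = ("o(", "optimal", "minimum", "maximum", "constraint")
    hits = sum(task.lower().count(k) for k in keywords)
    return min(1.0, len(task) / 1000 + 0.3 * hits)

def route_prompt(task: str, examples: str = "", threshold: float = 0.5) -> str:
    """Route to few-shot prompting below the threshold, ICoT above it."""
    if score_difficulty(task) < threshold:
        return FEW_SHOT_PROMPT.format(examples=examples, task=task)
    return ICOT_PROMPT.format(task=task)

easy = "Return the sum of two integers."
hard = "Find the minimum spanning tree of a weighted graph in O(E log V)."
print("ICoT" if "intention" in route_prompt(hard) else "few-shot")  # → ICoT
print("ICoT" if "intention" in route_prompt(easy) else "few-shot")  # → few-shot
```

In a real system the difficulty scorer would itself be learned or model-based (the paper describes a cognition-inspired, difficulty-perceiving router); the keyword heuristic here only stands in for that component.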
👥 Authors
Shen Li (Chongqing University)
Li Huang (Chongqing University)
Shaoxiong Zhan (Tsinghua University)
Weifeng Sun (Chongqing University)
Tao Yin (Chongqing University)
Zhongxin Liu (Zhejiang University)
Meng Yan (Chongqing University)