TextResNet: Decoupling and Routing Optimization Signals in Compound AI Systems via Deep Residual Tuning

📅 2026-02-09
📈 Citations: 0
Influential: 0
🤖 AI Summary
Existing text-based gradient optimization methods struggle to propagate effective optimization signals through deep composite AI systems due to semantic entanglement and ambiguous attribution. To address this challenge, this work proposes a novel optimization framework that enables precise disentanglement and directed routing of optimization signals through four core components: additive semantic increments, semantic gradient decomposition, causal routing mechanisms, and density-aware scheduling. The proposed approach substantially enhances training stability and task performance, outperforming baseline methods such as TextGrad on complex agent-based tasks. Notably, it maintains robust performance even in scenarios where existing methods fail entirely, demonstrating its effectiveness in mitigating signal degradation in deeply integrated language-driven systems.

📝 Abstract
Textual Gradient-style optimizers (TextGrad) enable gradient-like feedback propagation through compound AI systems, but they perform poorly on deep chains. This limitation stems from the Semantic Entanglement problem in extended workflows: in standard textual backpropagation, feedback signals mix local critiques with upstream contexts, leading to Attribution Ambiguity. To address this challenge, we propose TextResNet, a framework that reformulates the optimization process to achieve precise signal routing via four key innovations. First, in the forward pass, it enforces Additive Semantic Deltas to preserve an Identity Highway for gradient flow. Second, in the backward pass, it introduces Semantic Gradient Decomposition via a Semantic Projector to disentangle feedback into causally independent subspaces. Third, it implements Causal Routing, which routes each projected signal to the specific component responsible for it. Finally, it performs Density-Aware Optimization Scheduling, leveraging the disentangled signals to dynamically allocate optimization effort to key system bottlenecks. Our results show that TextResNet not only outperforms TextGrad but also remains remarkably stable on agentic tasks in compound AI systems where baselines collapse. Code is available at https://github.com/JeanDiable/TextResNet.
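To make the abstract's first and third ideas concrete, here is a minimal toy sketch of an additive forward pass and critique routing, based only on the abstract's description. All names (`Node`, `Trace`, `route_feedback`) and the structure are assumptions for illustration, not the authors' implementation; real nodes would call an LLM rather than a string transform.

```python
# Hypothetical sketch: Additive Semantic Deltas + Causal Routing, inferred
# from the abstract. Not the authors' code.
from dataclasses import dataclass, field
from typing import Callable, Dict, List, Tuple


@dataclass
class Node:
    """One component in a compound AI chain (LLM call stubbed as a function)."""
    name: str
    transform: Callable[[str], str]


@dataclass
class Trace:
    """The 'Identity Highway': the original input plus per-node deltas."""
    base: str
    deltas: List[Tuple[str, str]] = field(default_factory=list)

    def render(self) -> str:
        # Output = base context + additive semantic increments, so the
        # original signal survives arbitrarily deep chains intact.
        return self.base + "".join(d for _, d in self.deltas)


def forward(nodes: List[Node], x: str) -> Trace:
    trace = Trace(base=x)
    for node in nodes:
        delta = node.transform(trace.render())
        trace.deltas.append((node.name, delta))  # append, never overwrite
    return trace


def route_feedback(trace: Trace, critiques: Dict[str, str]) -> Dict[str, str]:
    # Causal Routing sketch: each critique is attributed to the node whose
    # delta it targets, instead of being blended into one upstream message.
    return {name: critiques.get(name, "") for name, _ in trace.deltas}


if __name__ == "__main__":
    chain = [
        Node("planner", lambda ctx: " plan"),
        Node("solver", lambda ctx: " answer"),
    ]
    trace = forward(chain, "Q:")
    print(trace.render())                                  # "Q: plan answer"
    print(route_feedback(trace, {"solver": "be precise"}))
```

Because each node only appends a delta, a critique of the final answer can be mapped back to the exact delta that caused it, which is the property the paper's routing mechanism relies on.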
Problem

Research questions and friction points this paper is trying to address.

Semantic Entanglement
Attribution Ambiguity
Textual Gradient
Compound AI Systems
Deep Chains
Innovation

Methods, ideas, or system contributions that make the work stand out.

Semantic Disentanglement
Causal Routing
Additive Semantic Deltas
Density-Aware Optimization
Textual Backpropagation