🤖 AI Summary
Under the vertex elimination model, this work establishes the NP-completeness of two fundamental graph problems in algorithmic differentiation: Structural Optimal Jacobian Accumulation (computing a Jacobian with the minimum number of multiplications) and Minimum Edge Count (finding a minimum-size computational graph), thereby resolving longstanding open questions. Unlike prior work, the hardness reduction for Structural Optimal Jacobian Accumulation makes no assumptions about algebraic relationships between local partial derivatives; these values may be mutually independent. The work also gives exact O*(2ⁿ)-time algorithms for both problems and shows that, under the Exponential Time Hypothesis (ETH), these running times are essentially tight. Finally, it provides a data reduction rule for Structural Optimal Jacobian Accumulation, showing that false twin vertices may always be eliminated consecutively; such rules are natural candidates for preprocessing in practical automatic differentiation (AD) tools.
📝 Abstract
We study graph-theoretic formulations of two fundamental problems in algorithmic differentiation. The first (Structural Optimal Jacobian Accumulation) is that of computing a Jacobian while minimizing multiplications. The second (Minimum Edge Count) is to find a minimum-size computational graph. For both problems, we consider the vertex elimination operation. Our main contribution is to show that both problems are NP-complete, thus resolving longstanding open questions. In contrast to prior work, our reduction for Structural Optimal Jacobian Accumulation does not rely on any assumptions about the algebraic relationships between local partial derivatives; we allow these values to be mutually independent. We also provide $O^*(2^n)$-time exact algorithms for both problems, and show that under the exponential time hypothesis these running times are essentially tight. Finally, we provide a data reduction rule for Structural Optimal Jacobian Accumulation by showing that false twins may always be eliminated consecutively.
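To make the vertex elimination operation concrete, here is a minimal sketch: eliminating an intermediate vertex v connects every predecessor of v to every successor of v (fill-in edges) and removes v, and structurally this incurs indeg(v) · outdeg(v) multiplications. The graph representation and function name below are illustrative, not taken from the paper.

```python
def eliminate(preds, succs, v):
    """Eliminate intermediate vertex v from a DAG given as predecessor/
    successor adjacency sets. Returns the structural multiplication count
    indeg(v) * outdeg(v) incurred by this elimination step."""
    cost = len(preds[v]) * len(succs[v])
    # Add a fill-in edge from each predecessor to each successor of v.
    for p in preds[v]:
        for s in succs[v]:
            succs[p].add(s)
            preds[s].add(p)
    # Detach and remove v itself.
    for p in preds[v]:
        succs[p].discard(v)
    for s in succs[v]:
        preds[s].discard(v)
    del preds[v], succs[v]
    return cost

# Toy computational graph: x -> a, a -> y, a -> z.
preds = {'x': set(), 'a': {'x'}, 'y': {'a'}, 'z': {'a'}}
succs = {'x': {'a'}, 'a': {'y', 'z'}, 'y': set(), 'z': set()}
print(eliminate(preds, succs, 'a'))  # 1 * 2 = 2 multiplications
print(succs['x'])                    # x now feeds y and z directly
```

The optimization problem studied in the paper arises because the total multiplication count depends on the order in which intermediate vertices are eliminated; this sketch only accounts for a single step.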