🤖 AI Summary
To address catastrophic forgetting of prior knowledge in neural ordinary differential equations (neural ODEs) during incremental learning, this paper builds on the previously proposed Tuning without Forgetting (TwF) mechanism, which preserves the endpoint mappings at already-learned samples. Methodologically, the control function is modeled in a Banach space; under nonsingularity conditions, the authors prove that the set of parameters preserving the historical mappings forms a finite-codimensional Banach submanifold, whose tangent space is a subspace of control functions along which those mappings stay invariant, establishing that TwF amounts to a continuation of the control function within this tangent space. Theoretically, the work provides the first rigorous mathematical foundation for exact mapping preservation, integrating differential geometry and optimal control theory; it goes beyond the conventional first-order approximation and guarantees exact retention of the endpoint mappings at previously learned samples under parameter updates.
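To make the summary concrete, the following is a minimal sketch of the objects involved, written in standard control-theoretic notation; the symbols here (the control space $\mathcal{U}$, the endpoint map $E_{x_0}$, the sample count $k$) are assumptions for illustration and may differ from the paper's own notation.

```latex
% Assumed setting: a neural ODE \dot{x}(t) = f(x(t), u(t)) with control
% u in a Banach space \mathcal{U}, and endpoint map E_{x_0}(u) = x(T; x_0, u).
% For previously learned input--label pairs (x_0^i, y^i), i = 1, ..., k,
% the mapping-preserving parameter set is
\[
  \mathcal{M} \;=\; \bigl\{\, u \in \mathcal{U} \;:\; E_{x_0^i}(u) = y^i,\ i = 1,\dots,k \,\bigr\}.
\]
% If the stacked differential D\mathbf{E}(u) = (DE_{x_0^1}(u), \dots, DE_{x_0^k}(u))
% is surjective (a nonsingularity condition on the control u), the implicit
% function theorem yields that \mathcal{M} is a Banach submanifold of finite
% codimension, with tangent space
\[
  T_u\mathcal{M} \;=\; \ker D\mathbf{E}(u),
\]
% so a TwF update is a continuation of u along T_u\mathcal{M}: it keeps all
% learned endpoint mappings exactly, not merely to first order.
```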
📝 Abstract
In our earlier work, we introduced the principle of Tuning without Forgetting (TwF) for the sequential training of neural ODEs, in which training samples are added iteratively and parameters are updated within the subspace of control functions that preserves, in the first-order approximation sense, the end-point mapping at previously learned samples on the manifold of output labels. In this letter, we prove that this parameter subspace forms a Banach submanifold of finite codimension under nonsingular controls, and we characterize its tangent space. This reveals that TwF corresponds to a continuation/deformation of the control function along the tangent space of this Banach submanifold, providing a theoretical foundation for its exact mapping preservation (that is, not forgetting) during sequential training, beyond the first-order approximation.
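As a concrete illustration of the first-order version of TwF that the abstract describes, the sketch below projects a candidate parameter update onto the null space of the stacked endpoint-map Jacobian, so that the outputs at previously learned samples are unchanged to first order. The function and variable names are hypothetical and this is not the authors' implementation; it is a minimal sketch assuming a finite-dimensional parameterization of the control.

```python
import numpy as np

def twf_project(grad, J):
    """Project a candidate update `grad` onto the null space of the stacked
    endpoint-map Jacobian `J` (shape: k*n x p), so that the update preserves,
    to first order, the endpoint mapping at the k previously learned samples.

    Hypothetical illustration of first-order TwF; not the paper's code.
    """
    # Solve (J J^T) c = J g by least squares instead of inverting J J^T,
    # which may be ill-conditioned for nearly singular controls.
    coeffs, *_ = np.linalg.lstsq(J @ J.T, J @ grad, rcond=None)
    return grad - J.T @ coeffs

# Toy usage: p = 6 parameters, k*n = 2 preserved output coordinates.
rng = np.random.default_rng(0)
J = rng.standard_normal((2, 6))      # Jacobian of old endpoints w.r.t. params
g = rng.standard_normal(6)           # gradient of the loss on a new sample
g_proj = twf_project(g, J)
print(np.allclose(J @ g_proj, 0.0))  # True: old endpoints fixed to first order
```

In the exact setting established by this letter, the update is instead continued along the tangent space of the Banach submanifold itself, so the learned endpoint mappings are preserved exactly rather than only to first order.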