🤖 AI Summary
To address Euler angle singularities and the computational complexity of Lie group optimization in rotational modeling, this paper proposes the Quaternion Supervised Hopfield Neural Network (QSHNN): a continuous-time, fully connected neural architecture operating directly in the quaternion domain to ensure asymptotic stability of attitude learning. We introduce a novel periodic least-squares projection-based supervised learning strategy that enforces real-time structural constraints on weight matrix blocks to respect quaternion algebra—establishing, for the first time, an asymptotic stability theory for fixed points within a noncommutative algebraic framework. By integrating quaternion-valued gradient descent with structured weight constraints, QSHNN guarantees training convergence, algebraic consistency, and trajectory smoothness. Experiments demonstrate that QSHNN achieves high accuracy, rapid convergence, and strong robustness on random attitude estimation tasks, with bounded trajectory curvature—significantly outperforming real-valued baseline methods in critical applications such as robotic manipulator control and motion planning.
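The summary refers to a continuous-time Hopfield model whose fixed points are asymptotically stable. As background, the classic real-valued dynamics that the paper extends to the quaternion domain can be sketched as follows (this is the standard textbook form with a `tanh` activation and forward-Euler integration; the paper's exact quaternion-valued dynamics and parameters are not specified here, so this is an illustrative assumption):

```python
import numpy as np

def hopfield_step(x, W, b, dt=0.01, tau=1.0):
    """One forward-Euler step of the continuous-time Hopfield dynamics
    tau * dx/dt = -x + W @ tanh(x) + b.
    (Standard textbook form; the paper's quaternion-valued model differs.)"""
    return x + (dt / tau) * (-x + W @ np.tanh(x) + b)

# Small symmetric W: a contraction, so a unique stable fixed point exists.
rng = np.random.default_rng(1)
n = 8
A = rng.standard_normal((n, n))
W = 0.05 * (A + A.T)
b = 0.1 * rng.standard_normal(n)

x = rng.standard_normal(n)
for _ in range(5000):
    x = hopfield_step(x, W, b)

# At a fixed point the vector field vanishes: x = W @ tanh(x) + b.
residual = np.linalg.norm(-x + W @ np.tanh(x) + b)
```

With the weight matrix scaled small enough that the dynamics are contractive, the trajectory settles to the fixed point and the residual of the equilibrium equation becomes negligible.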
📝 Abstract
Motivated by the geometric advantages of quaternions in representing rotations and postures, we propose a quaternion-valued supervised learning Hopfield-structured neural network (QSHNN) with a fully connected structure inspired by the classic Hopfield neural network (HNN). Starting from a continuous-time dynamical model of HNNs, we extend the formulation to the quaternionic domain and establish the existence, uniqueness, and asymptotic stability of its fixed points. For the learning rule, we introduce a periodic projection strategy that modifies standard gradient descent by periodically projecting each 4×4 block of the weight matrix onto the closest quaternionic structure in the least-squares sense. This approach preserves both convergence and quaternionic consistency throughout training. Benefiting from this rigorous mathematical foundation, the implemented model achieves high accuracy, fast convergence, and strong reliability across randomly generated target sets. Moreover, the evolution trajectories of the QSHNN exhibit well-bounded curvature, i.e., sufficient smoothness, which is crucial for applications such as control systems or path-planning modules in robotic arms, where joint postures are parameterized by quaternion neurons. Beyond these application scenarios, the proposed model offers a practical implementation framework and a general mathematical methodology for designing neural networks under hypercomplex or non-commutative algebraic structures.
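The periodic least-squares projection has a closed form worth noting: the real 4×4 matrices representing quaternions form a linear subspace of ℝ^{4×4} spanned by four mutually orthogonal basis matrices (the representations of 1, i, j, k), so the Frobenius-norm projection of an arbitrary 4×4 block is just an orthogonal expansion in that basis. A minimal NumPy sketch, assuming the common left-multiplication sign convention (the function names are illustrative, not the paper's implementation):

```python
import numpy as np

# Orthogonal basis of the quaternionic subspace of R^{4x4}: the real
# left-multiplication matrices of 1, i, j, k (one common convention;
# assumed here for illustration). Each has squared Frobenius norm 4.
E = np.array([
    [[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1]],    # 1
    [[0, -1, 0, 0], [1, 0, 0, 0], [0, 0, 0, -1], [0, 0, 1, 0]],  # i
    [[0, 0, -1, 0], [0, 0, 0, 1], [1, 0, 0, 0], [0, -1, 0, 0]],  # j
    [[0, 0, 0, -1], [0, 0, -1, 0], [0, 1, 0, 0], [1, 0, 0, 0]],  # k
], dtype=float)

def project_block(M):
    """Least-squares (Frobenius) projection of a 4x4 block onto the
    quaternionic subspace: coefficient of E[n] is <M, E[n]> / 4."""
    coeffs = np.tensordot(E, M, axes=([1, 2], [0, 1])) / 4.0
    return np.tensordot(coeffs, E, axes=(0, 0))

def project_weights(W):
    """Apply the block-wise projection to a (4m) x (4n) weight matrix,
    as in the periodic projection step after gradient-descent updates."""
    W = np.asarray(W, dtype=float)
    P = W.copy()
    for i in range(W.shape[0] // 4):
        for j in range(W.shape[1] // 4):
            P[4*i:4*i+4, 4*j:4*j+4] = project_block(W[4*i:4*i+4, 4*j:4*j+4])
    return P
```

Because the target set is a linear subspace, the projection is idempotent and leaves already-quaternionic blocks unchanged, which is what lets it be interleaved with ordinary gradient steps without fighting them.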