Asymptotically Stable Quaternion-valued Hopfield-structured Neural Network with Periodic Projection-based Supervised Learning Rules

📅 2025-10-18
📈 Citations: 0
Influential: 0
🤖 AI Summary
To address Euler angle singularities and the computational complexity of Lie group optimization in rotational modeling, this paper proposes the Quaternion Supervised Hopfield Neural Network (QSHNN): a continuous-time, fully connected neural architecture operating directly in the quaternion domain to ensure asymptotic stability of attitude learning. We introduce a novel periodic least-squares projection-based supervised learning strategy that enforces real-time structural constraints on weight matrix blocks to respect quaternion algebra—establishing, for the first time, an asymptotic stability theory for fixed points within a noncommutative algebraic framework. By integrating quaternion-valued gradient descent with structured weight constraints, QSHNN guarantees training convergence, algebraic consistency, and trajectory smoothness. Experiments demonstrate that QSHNN achieves high accuracy, rapid convergence, and strong robustness on random attitude estimation tasks, with bounded trajectory curvature—significantly outperforming real-valued baseline methods in critical applications such as robotic manipulator control and motion planning.

📝 Abstract
Motivated by the geometric advantages of quaternions in representing rotations and postures, we propose a quaternion-valued supervised learning Hopfield-structured neural network (QSHNN) with a fully connected structure inspired by the classic Hopfield neural network (HNN). Starting from a continuous-time dynamical model of HNNs, we extend the formulation to the quaternionic domain and establish the existence and uniqueness of fixed points with asymptotic stability. For the learning rules, we introduce a periodic projection strategy that modifies standard gradient descent by periodically projecting each 4×4 block of the weight matrix onto the closest quaternionic structure in the least-squares sense. This approach preserves both convergence and quaternionic consistency throughout training. Benefiting from this rigorous mathematical foundation, the implemented model achieves high accuracy, fast convergence, and strong reliability across randomly generated target sets. Moreover, the evolution trajectories of the QSHNN exhibit well-bounded curvature, i.e., sufficient smoothness, which is crucial for applications such as control systems or path planning modules in robotic arms, where joint postures are parameterized by quaternion neurons. Beyond these application scenarios, the proposed model offers a practical implementation framework and a general mathematical methodology for designing neural networks under hypercomplex or non-commutative algebraic structures.
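The block-projection step in the abstract admits a compact sketch. A minimal version, assuming the standard left-multiplication real 4×4 representation of a quaternion w + xi + yj + zk (the paper's exact convention may differ): since that representation spans a linear subspace of 4×4 matrices with orthogonal basis blocks, the least-squares projection of an arbitrary block reduces to signed averaging of the four entries carrying each quaternion component. The function names below are illustrative, not from the paper.

```python
import numpy as np

def quat_to_block(w, x, y, z):
    """Real 4x4 left-multiplication matrix of the quaternion w + xi + yj + zk."""
    return np.array([
        [w, -x, -y, -z],
        [x,  w, -z,  y],
        [y,  z,  w, -x],
        [z, -y,  x,  w],
    ])

def project_block(M):
    """Least-squares projection of an arbitrary 4x4 block onto the
    quaternionic structure above. Each component appears in exactly four
    entries with signs +/-1, so the Frobenius-norm minimizer is the
    signed average of those entries."""
    w = (M[0, 0] + M[1, 1] + M[2, 2] + M[3, 3]) / 4.0
    x = (M[1, 0] + M[3, 2] - M[0, 1] - M[2, 3]) / 4.0
    y = (M[2, 0] + M[1, 3] - M[0, 2] - M[3, 1]) / 4.0
    z = (M[3, 0] + M[2, 1] - M[0, 3] - M[1, 2]) / 4.0
    return quat_to_block(w, x, y, z)
```

Because the projection is orthogonal onto a linear subspace, it is idempotent and leaves already-quaternionic blocks unchanged, which is what lets the strategy enforce algebraic consistency without disturbing converged weights.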
Problem

Research questions and friction points this paper is trying to address.

Extends Hopfield neural networks to quaternion domain for rotation representation
Develops periodic projection learning rules for quaternionic weight matrix consistency
Ensures asymptotic stability and smooth trajectories for robotic applications
Innovation

Methods, ideas, or system contributions that make the work stand out.

Quaternion-valued Hopfield neural network with supervised learning
Periodic projection strategy for quaternionic weight structure
Asymptotic stability and smooth trajectories for robotic control
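The periodic-projection learning rule listed above can be sketched as a training loop: ordinary gradient descent on the full weight matrix, with every 4×4 block snapped back to the nearest quaternionic structure every few steps. This is a self-contained sketch under assumed conventions (left-multiplication quaternion blocks, a caller-supplied `grad_fn`, and illustrative `lr`/`period` values); it is not the paper's reference implementation.

```python
import numpy as np

def project_weights(W):
    """Project each 4x4 block of W (shape (4n, 4n)) onto the nearest
    quaternionic structure in the least-squares sense."""
    n = W.shape[0] // 4
    W = W.copy()
    for i in range(n):
        for j in range(n):
            b = W[4*i:4*i+4, 4*j:4*j+4]
            # signed averages of the entries carrying each component
            w = (b[0, 0] + b[1, 1] + b[2, 2] + b[3, 3]) / 4.0
            x = (b[1, 0] + b[3, 2] - b[0, 1] - b[2, 3]) / 4.0
            y = (b[2, 0] + b[1, 3] - b[0, 2] - b[3, 1]) / 4.0
            z = (b[3, 0] + b[2, 1] - b[0, 3] - b[1, 2]) / 4.0
            W[4*i:4*i+4, 4*j:4*j+4] = [[w, -x, -y, -z],
                                       [x,  w, -z,  y],
                                       [y,  z,  w, -x],
                                       [z, -y,  x,  w]]
    return W

def train(W, grad_fn, lr=0.1, steps=100, period=10):
    """Gradient descent with a projection every `period` steps.
    grad_fn(W) supplies the task-specific gradient (hypothetical here)."""
    for t in range(1, steps + 1):
        W = W - lr * grad_fn(W)
        if t % period == 0:
            W = project_weights(W)
    return W
```

Projecting only periodically, rather than after every step, keeps the per-step cost of plain gradient descent while still returning the weights to the quaternionic manifold often enough to preserve algebraic consistency.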