ActPC-Geom: Towards Scalable Online Neural-Symbolic Learning via Accelerating Active Predictive Coding with Information Geometry & Diverse Cognitive Mechanisms

📅 2025-01-08
📈 Citations: 0
Influential: 0
🤖 AI Summary
This study addresses the challenge of simultaneously achieving real-time inference, hybrid data processing (continuous/discrete, symbolic/subsymbolic), and cognitive interpretability in neural networks. Methodologically, it proposes an online neurosymbolic learning framework grounded in the Wasserstein metric: replacing conventional KL-divergence-driven predictive-coding error with Wasserstein gradient flows; integrating differentiable kernel PCA, hypervector algebra, fuzzy formal concept analysis, and Galois-connection-based optimization for modeling cognitive mechanisms; and unifying Hopfield-style associative memory with transformer-like compositional reasoning. The proposed contributions include rapid online weight updates with few-shot adaptability; joint symbolic-subsymbolic reasoning targeted at commonsense reasoning and algorithmic chemistry evolution tasks; and interoperability with heterogeneous systems such as OpenCog Hyperon. The framework outlines a path toward scalable, interpretable, real-time artificial general intelligence.
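The central substitution described here, a Wasserstein distance in place of KL divergence for predictive-error assessment, can be illustrated with a minimal 1-D sketch. This is not code from the paper; the function names and toy distributions are illustrative. It shows the robustness argument in miniature: KL blows up when the predicted and actual distributions barely overlap, while the Wasserstein distance stays proportional to how far probability mass must be transported.

```python
import numpy as np

def kl_divergence(p, q):
    """KL(p || q) for discrete distributions on a shared support."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    return float(np.sum(p * np.log(p / q)))

def wasserstein_1d(p, q, support):
    """W1 distance on a 1-D grid: integral of |CDF_p - CDF_q|."""
    cdf_gap = np.abs(np.cumsum(p) - np.cumsum(q))
    dx = np.diff(support)
    return float(np.sum(cdf_gap[:-1] * dx))

support = np.linspace(0.0, 1.0, 101)
# Two narrow, nearly disjoint bumps: a "predicted" and an "actual"
# distribution whose supports barely overlap.
p = np.exp(-((support - 0.3) ** 2) / 0.002); p /= p.sum()
q = np.exp(-((support - 0.7) ** 2) / 0.002); q /= q.sum()

print(kl_divergence(p, q))            # explodes where supports barely overlap
print(wasserstein_1d(p, q, support))  # ≈ |0.7 - 0.3| = 0.4, the transport cost
```

The KL term is driven by log-density ratios and becomes huge (or undefined) for near-disjoint distributions, whereas W1 degrades gracefully with geometric distance, which is the intuition behind using Wasserstein gradient flows for more robust predictive-error signals.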

📝 Abstract
This paper introduces ActPC-Geom, an approach to accelerate Active Predictive Coding (ActPC) in neural networks by integrating information geometry, specifically using Wasserstein-metric-based methods for measure-dependent gradient flows. We propose replacing KL-divergence in ActPC's predictive error assessment with the Wasserstein metric, suggesting this may enhance network robustness. To make this computationally feasible, we present strategies including: (1) neural approximators for inverse measure-dependent Laplacians, (2) approximate kernel PCA embeddings for low-rank approximations feeding into these approximators, and (3) compositional hypervector embeddings derived from kPCA outputs, with algebra optimized for fuzzy FCA lattices learned through neural architectures analyzing network states. This results in an ActPC architecture capable of real-time online learning and integrating continuous (e.g., transformer-like or Hopfield-net-like) and discrete symbolic ActPC networks, including frameworks like OpenCog Hyperon or ActPC-Chem for algorithmic chemistry evolution. Shared probabilistic, concept-lattice, and hypervector models enable symbolic-subsymbolic integration. Key features include (1) compositional reasoning via hypervector embeddings in transformer-like architectures for tasks like commonsense reasoning, and (2) Hopfield-net dynamics enabling associative long-term memory and attractor-driven cognitive features. We outline how ActPC-Geom combines few-shot learning with online weight updates, enabling deliberative thinking and seamless symbolic-subsymbolic reasoning. Ideas from Galois connections are explored for efficient hybrid ActPC/ActPC-Chem processing. Finally, we propose a specialized HPC design optimized for real-time focused attention and deliberative reasoning tailored to ActPC-Geom's demands.
Problem

Research questions and friction points this paper is trying to address.

Neural Networks
Continuous and Discrete Information
Real-time Learning
Innovation

Methods, ideas, or system contributions that make the work stand out.

Wasserstein Metric
Symbolic and Non-Symbolic Processing
Transformers for Commonsense Reasoning