🤖 AI Summary
This paper addresses the fundamental problem of how consciousness mediates information transfer from unconscious to conscious memory. Methodologically, it introduces the first category-theoretic framework for modeling consciousness: a *Consciousness Functor* (CF) formalized within a topos, capturing consciousness as a “content selection and amplification” mechanism. Unconscious processes are modeled via coalgebras grounded in Global Workspace Theory; a multimodal internal language (MUMBLE) is integrated with Universal Reinforcement Learning (URL) and network-economic resource allocation to enable bidirectional, resource-constrained dynamics between short-term and long-term memory. The primary contribution is the first rigorous functorial characterization of consciousness as an information-integration operator, yielding a computationally tractable and empirically testable mathematical framework for conscious–unconscious interaction, thereby advancing cognitive modeling with both formal rigor and neurobiological plausibility.
📝 Abstract
We propose a novel theory of consciousness as a functor (CF) that receives content from unconscious memory and transmits it into conscious memory. Our CF framework can be seen as a categorical formulation of the Global Workspace Theory proposed by Baars. CF models the ensemble of unconscious processes as a topos category of coalgebras. The internal language of thought in CF is defined as a Multi-modal Universal Mitchell–Bénabou Language Embedding (MUMBLE). We model the transmission of information from conscious short-term working memory to long-term unconscious memory using our recently proposed Universal Reinforcement Learning (URL) framework. To model the transmission of information from unconscious long-term memory into resource-constrained short-term memory, we propose a network-economic model.