🤖 AI Summary
To address severe catastrophic forgetting in continual knowledge graph embedding (CKGE), this paper proposes BAKE, a Bayesian-guided continual embedding framework. BAKE introduces Bayesian posterior updating into CKGE for the first time, enabling order-agnostic parameter evolution; it employs variational inference to model embedding uncertainty and incorporates a semantic consistency–driven continual clustering constraint to explicitly regulate knowledge evolution discrepancies. Extensive experiments on multiple real-world dynamic knowledge graphs demonstrate that BAKE significantly outperforms existing state-of-the-art methods in both historical knowledge retention and novel fact prediction. By effectively mitigating catastrophic forgetting, BAKE achieves superior overall generalization performance.
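The paper does not spell out the clustering constraint here, but the general idea, regulating how far the embeddings of entities shared between snapshots are allowed to drift, can be sketched with a simple regularizer. All names below (`drift_penalty`, the toy embeddings) are illustrative assumptions, not BAKE's actual formulation.

```python
import numpy as np

def drift_penalty(old_emb, new_emb, shared_ids, weight=0.1):
    """Hypothetical consistency regularizer: penalize the squared change
    amplitude of embeddings for entities present in both snapshots."""
    old = old_emb[shared_ids]  # embeddings from the previous snapshot
    new = new_emb[shared_ids]  # current embeddings of the same entities
    return weight * np.sum((new - old) ** 2)

# Toy usage: 4 entities with 2-dim embeddings, 2 of them shared.
old_emb = np.zeros((4, 2))
new_emb = np.ones((4, 2))
pen = drift_penalty(old_emb, new_emb, np.array([0, 2]))  # 0.1 * 4 = 0.4
```

In training, such a term would be added to the link-prediction loss on the new snapshot, so the optimizer trades off fitting new facts against preserving old representations.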
📝 Abstract
Since knowledge graphs (KGs) continue to evolve in real-world scenarios, traditional KGE models, which are designed for static graphs, no longer suffice, and continual knowledge graph embedding (CKGE) has therefore attracted growing attention. A key challenge in CKGE is that models are prone to "catastrophic forgetting", losing previously learned knowledge as new data arrive. To alleviate this problem, we propose a new CKGE model, BAKE. First, we observe that the Bayesian posterior-update principle provides a natural continual learning strategy: it is insensitive to data order and can, in theory, resist the forgetting of earlier knowledge as the data evolve. Unlike existing CKGE methods, BAKE treats each batch of new data as a Bayesian update of the model's prior. Under this framework, as long as the model's posterior distribution is maintained, knowledge from early snapshots is preserved even after the model evolves through many time snapshots. Second, we propose a continual clustering method for CKGE that further combats forgetting by constraining the evolution difference (change amplitude) between new and old knowledge across snapshots. Extensive experiments with BAKE on multiple datasets show that it significantly outperforms existing baseline models.
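The order-insensitivity of Bayesian posterior updating can be seen in a minimal conjugate example: if each snapshot's posterior serves as the prior for the next, the final posterior is the same regardless of the order in which snapshots arrive. This toy sketch uses a 1-D Gaussian mean with known noise precision, a deliberate simplification of the variational treatment in the paper; `bayes_update` and the snapshot data are illustrative only.

```python
import numpy as np

def bayes_update(mu, prec, batch, noise_prec=1.0):
    """One conjugate Bayesian update of a Gaussian posterior N(mu, 1/prec)
    over an unknown mean, given a batch of observations with known noise.
    The previous posterior acts as the prior for the new batch."""
    n = len(batch)
    new_prec = prec + n * noise_prec
    new_mu = (prec * mu + noise_prec * np.sum(batch)) / new_prec
    return new_mu, new_prec

# Two "snapshots" of data processed in either order yield identical
# posteriors, illustrating the order-insensitivity the paper appeals to.
s1, s2 = np.array([1.0, 2.0]), np.array([3.0, 4.0, 5.0])
a = bayes_update(*bayes_update(0.0, 1.0, s1), s2)  # s1 then s2
b = bayes_update(*bayes_update(0.0, 1.0, s2), s1)  # s2 then s1
```

BAKE's embeddings are high-dimensional and the exact posterior is intractable, which is why the paper resorts to variational inference, but the sequential prior-becomes-posterior structure is the same.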