LogicMP: A Neuro-symbolic Approach for Encoding First-order Logic Constraints

📅 2023-09-27
🏛️ International Conference on Learning Representations
📈 Citations: 2
Influential: 0
🤖 AI Summary
This work addresses the challenge of effectively integrating first-order logic constraints (FOLCs) into neural networks. The authors propose LogicMP, a plug-and-play neural layer that embeds Markov logic network (MLN) structure and logical symmetries into a differentiable architecture. Leveraging structure-aware mean-field variational inference, LogicMP transforms sequential symbolic reasoning into parallel tensor operations, striking a favorable balance among expressivity, modularity, and training efficiency. Empirically, it outperforms state-of-the-art baselines across graph, image, and text domains: inference runs several times faster while accuracy also improves. LogicMP thus offers an efficient and general paradigm for encoding logical constraints in neuro-symbolic integration.
📝 Abstract
Integrating first-order logic constraints (FOLCs) with neural networks is a crucial but challenging problem since it involves modeling intricate correlations to satisfy the constraints. This paper proposes a novel neural layer, LogicMP, whose layers perform mean-field variational inference over an MLN. It can be plugged into any off-the-shelf neural network to encode FOLCs while retaining modularity and efficiency. By exploiting the structure and symmetries in MLNs, we theoretically demonstrate that our well-designed, efficient mean-field iterations effectively mitigate the difficulty of MLN inference, reducing the inference from sequential calculation to a series of parallel tensor operations. Empirical results in three kinds of tasks over graphs, images, and text show that LogicMP outperforms advanced competitors in both performance and efficiency.
Problem

Research questions and friction points this paper is trying to address.

Integrating first-order logic constraints with neural networks
Encoding FOLCs efficiently while retaining modularity
Mitigating MLN inference difficulty through parallel tensor operations
Innovation

Methods, ideas, or system contributions that make the work stand out.

LogicMP layer performs mean-field variational inference
Encodes first-order logic constraints into neural networks
Reduces inference to parallel tensor operations efficiently
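The key efficiency claim is that mean-field updates over MLN groundings can be batched into tensor operations instead of looping over groundings one by one. Below is a minimal NumPy sketch of that idea for a toy rule friends(x, y) AND smokes(x) => smokes(y); the function name, the single-rule setup, and the simplified update are illustrative assumptions, not the paper's actual equations.

```python
import numpy as np

def mean_field_step(phi, q, friends, w):
    """One toy mean-field iteration for the rule
    friends(x, y) AND smokes(x) => smokes(y) with weight w.

    phi:     (n,) unary logits for smokes(.) from a neural network
    q:       (n,) current marginals q(smokes(x) = 1)
    friends: (n, n) 0/1 matrix, friends[x, y] = 1 if x and y are friends
    """
    # Each grounding (x, y) of the rule sends a message to smokes(y).
    # Summing messages over all x is one matrix-vector product, so the
    # whole grounding set is processed in parallel rather than sequentially.
    msg = w * friends.T @ q
    return 1.0 / (1.0 + np.exp(-(phi + msg)))  # sigmoid of updated logits

# Toy run: 4 entities, random unary logits and friendship graph.
rng = np.random.default_rng(0)
phi = rng.normal(size=4)
friends = (rng.random((4, 4)) < 0.5).astype(float)
q = 1.0 / (1.0 + np.exp(-phi))  # initialize from unary logits
for _ in range(5):
    q = mean_field_step(phi, q, friends, w=1.5)
```

The sketch captures the structural point: the per-grounding message passing collapses into dense linear algebra, which is what lets LogicMP run as an ordinary differentiable layer.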
Weidi Xu, Infly Technology
Jingwei Wang, Ant Group
Lele Xie, South China University of Technology (multimodal large language models, computer vision, OCR, object detection, deep learning)
Jianshan He, Ant Group
Hongting Zhou, Ant Group
Taifeng Wang, Principal Researcher, ByteDance (graph learning, large-scale pretrained language models, drug design and target discovery, search and …)
Xiaopei Wan, Ant Group
Jingdong Chen, Ant Group
Chao Qu, INFLY TECH (Shanghai) Co., Ltd
Wei Chu, Ant Group