Tensor Decomposition Meets Knowledge Compilation: A Study Comparing Tensor Trains with OBDDs

📅 2025-02-06
📈 Citations: 0
Influential: 0
🤖 AI Summary
Traditional knowledge compilation is largely restricted to subclasses of negation normal form (NNF), limiting expressive power and efficiency. Method: This work introduces tensor train (TT) decomposition—a compact low-rank tensor representation—as a novel, non-graph-based formalism for representing Boolean functions within knowledge compilation. We systematically integrate TT into the theoretical framework of knowledge compilation and establish rigorous correspondences between TT representations and NNF classes. Contribution/Results: We prove that TT representations are exponentially more succinct than ordered binary decision diagrams (OBDDs) while supporting polynomial-time model counting and entailment checking—two key tractable reasoning tasks. Empirical evaluation confirms that TT achieves an effective trade-off between representation compactness and inference efficiency. This work establishes TT as a theoretically grounded, computationally viable alternative to classical graph-based representations, thereby extending the scope of knowledge compilation beyond traditional NNF-based paradigms.
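To make the tractability claim concrete, here is a minimal illustrative sketch (not the paper's implementation) of a Boolean function stored as a tensor train: each variable gets a core of shape (r_prev, 2, r_next), evaluation multiplies one slice per core, and model counting sums both slices of each core before multiplying, which takes time polynomial in the ranks. The XOR cores below are hand-picked for this example.

```python
import numpy as np

def tt_eval(cores, assignment):
    """Evaluate the TT-represented Boolean function on a 0/1 assignment."""
    v = np.ones((1, 1))  # boundary ranks are 1
    for core, x in zip(cores, assignment):
        v = v @ core[:, x, :]  # pick the slice for this variable's value
    return int(v[0, 0])

def tt_model_count(cores):
    """Polytime model counting: sum each core over both values, then multiply."""
    v = np.ones((1, 1))
    for core in cores:
        v = v @ (core[:, 0, :] + core[:, 1, :])
    return int(v[0, 0])

# Hand-constructed rank-2 TT for f(x1, x2) = x1 XOR x2 (assumed example).
G1 = np.zeros((1, 2, 2)); G1[0, 0] = [1, 0]; G1[0, 1] = [0, 1]
G2 = np.zeros((2, 2, 1)); G2[:, 0, 0] = [0, 1]; G2[:, 1, 0] = [1, 0]
cores = [G1, G2]

print([tt_eval(cores, a) for a in [(0, 0), (0, 1), (1, 0), (1, 1)]])  # [0, 1, 1, 0]
print(tt_model_count(cores))  # 2
```

The counting loop never enumerates assignments: it contracts one small matrix per variable, mirroring how OBDD-style polytime operations carry over to the TT setting.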

📝 Abstract
A knowledge compilation map analyzes tractable operations in Boolean function representations and compares their succinctness. This enables the selection of appropriate representations for different applications. In the knowledge compilation map, all representation classes are subsets of the negation normal form (NNF). However, some Boolean functions may be better expressed by a representation outside the NNF subsets. In this study, we treat tensor trains as Boolean function representations and analyze their succinctness and tractability. Our study is the first to evaluate the expressiveness of a tensor decomposition method using criteria from the knowledge compilation literature. Our main results demonstrate that tensor trains are more succinct than ordered binary decision diagrams (OBDDs) and support the same polytime operations as OBDDs. Our study broadens the application of tensor trains by providing a theoretical link between tensor decomposition and existing NNF subsets.
Problem

Research questions and friction points this paper is trying to address.

Compare tensor trains with OBDDs
Analyze succinctness and tractability of tensor trains
Link tensor decomposition with NNF subsets
Innovation

Methods, ideas, or system contributions that make the work stand out.

Tensor trains as Boolean representations
Compare succinctness with OBDDs
Support the same polytime operations as OBDDs
Ryoma Onaka
NTT Communication Science Laboratories, NTT Corporation, Kyoto, Japan
Kengo Nakamura
NTT Communication Science Laboratories, NTT Corporation, Kyoto, Japan
Masaaki Nishino
NTT
Norihito Yasuda
NTT Communication Science Laboratories, NTT Corporation, Kyoto, Japan