SONGs: Self-Organizing Neural Graphs

📅 2021-07-28
🏛️ IEEE/CVF Winter Conference on Applications of Computer Vision (WACV)
📈 Citations: 1
Influential: 0
🤖 AI Summary
Decision trees offer strong interpretability but cannot reuse decision nodes, whereas decision diagrams allow node sharing yet lack a differentiable training mechanism, preventing end-to-end optimization. To address this, the paper proposes Self-Organizing Neural Graphs (SONG), presented as the first differentiable decision-diagram framework: it models routing through the diagram as a Markov process, making the whole structure trainable by gradient-based backpropagation. SONG combines neural networks with graph-structured decisions, supporting node sharing, hierarchical class representations, and joint neural-symbolic optimization. Evaluated on Letter, Connect4, MNIST, CIFAR-10/100, and TinyImageNet, SONG matches or exceeds the accuracy of state-of-the-art interpretable models while preserving interpretability and structural compactness.
📝 Abstract
Recent years have seen a surge in research on combining deep neural networks with other methods, including decision trees and graphs. There are at least three advantages of incorporating decision trees and graphs: they are easy to interpret since they are based on sequential decisions, they can make decisions faster, and they provide a hierarchy of classes. However, one of the well-known drawbacks of decision trees, as compared to decision graphs, is that decision trees cannot reuse the decision nodes. Nevertheless, decision graphs were not commonly used in deep learning due to the lack of efficient gradient-based training techniques. In this paper, we fill this gap and provide a general paradigm based on Markov processes, which allows for efficient training of the special type of decision graphs, which we call Self-Organizing Neural Graphs (SONG). We provide a theoretical study on SONG, complemented by experiments conducted on Letter, Connect4, MNIST, CIFAR, and TinyImageNet datasets, showing that our method performs on par or better than existing decision models.
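The abstract's core idea, a decision graph whose routing is a Markov process so that training can use gradients, can be sketched in a few lines. This is a hedged illustration, not the authors' code: node layout, weight shapes, and the fixed child wiring are illustrative assumptions. Each internal node emits a soft (sigmoid) branching probability, the graph becomes a row-stochastic transition matrix with absorbing leaf states, and the class prediction is a mixture over leaf distributions weighted by absorption probability, so every step is differentiable.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Illustrative sizes: 3 internal nodes, 2 leaves, 4 input features, 2 classes.
n_internal, n_leaves, d, n_classes = 3, 2, 4, 2
W = rng.normal(size=(n_internal, d))       # per-node soft routing weights
leaf_dist = np.array([[0.9, 0.1],          # class distribution at leaf 0
                      [0.2, 0.8]])         # class distribution at leaf 1

# child[i] = (left_target, right_target), indexing the combined state space
# [internal 0..2, leaves 3..4]. Nodes 1 and 2 share the same leaf targets,
# the node reuse that a tree cannot express.
child = [(1, 2), (3, 4), (3, 4)]

def forward(x):
    n_states = n_internal + n_leaves
    T = np.zeros((n_states, n_states))     # row-stochastic transition matrix
    for i, (l, r) in enumerate(child):
        p_right = sigmoid(W[i] @ x)        # soft, differentiable split
        T[i, l] += 1.0 - p_right
        T[i, r] += p_right
    for j in range(n_internal, n_states):  # leaves are absorbing states
        T[j, j] = 1.0
    mass = np.zeros(n_states)
    mass[0] = 1.0                          # all probability starts at the root
    for _ in range(n_internal):            # enough steps to absorb into leaves
        mass = mass @ T
    leaf_mass = mass[n_internal:]          # probability of reaching each leaf
    return leaf_mass @ leaf_dist           # mixture over leaf predictions

pred = forward(rng.normal(size=d))
assert np.isclose(pred.sum(), 1.0)         # valid class distribution
```

Because `forward` is a composition of smooth operations, a cross-entropy loss on `pred` can be backpropagated to the routing weights `W` and the leaf distributions, which is the training recipe the paper's Markov formulation enables.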
Problem

Research questions and friction points this paper is trying to address.

Develops gradient-based training for decision graphs in deep learning
Addresses lack of node reuse in decision trees via neural graphs
Proposes Self-Organizing Neural Graphs (SONG) for interpretable deep networks
Innovation

Methods, ideas, or system contributions that make the work stand out.

Self-Organizing Neural Graphs (SONG) that enable decision-node reuse
Efficient gradient-based training via a Markov-process formulation
Accuracy on par with or better than existing decision models on Letter, Connect4, MNIST, CIFAR, and TinyImageNet