Random Search Neural Networks for Efficient and Expressive Graph Learning

📅 2025-10-26
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
Random walk neural networks (RWNNs) suffer from incomplete node/edge coverage and weak global structural modeling on small or sparse graphs. To address this, the authors propose Random Search Neural Networks (RSNNs), built on a theoretically grounded random search mechanism: each search guarantees full node coverage, and in sparse graphs only $O(\log |V|)$ searches suffice for full edge coverage, significantly reducing sampling complexity. RSNNs encode the generated search sequences with a sequence model; the design is probabilistically invariant to graph isomorphisms and, when paired with universal sequence models, yields universal approximation. On molecular and protein benchmarks, RSNNs match or outperform RWNNs while requiring up to 16× fewer sampled sequences, demonstrating superior efficiency and expressive power.

📝 Abstract
Random walk neural networks (RWNNs) have emerged as a promising approach for graph representation learning, leveraging recent advances in sequence models to process random walks. However, under realistic sampling constraints, RWNNs often fail to capture global structure even in small graphs due to incomplete node and edge coverage, limiting their expressivity. To address this, we propose *random search neural networks* (RSNNs), which operate on random searches, each of which guarantees full node coverage. Theoretically, we demonstrate that in sparse graphs, only $O(\log |V|)$ searches are needed to achieve full edge coverage, substantially reducing sampling complexity compared to the $O(|V|)$ walks required by RWNNs (assuming walk lengths scale with graph size). Furthermore, when paired with universal sequence models, RSNNs are universal approximators. We lastly show RSNNs are probabilistically invariant to graph isomorphisms, ensuring their expectation is an isomorphism-invariant graph function. Empirically, RSNNs consistently outperform RWNNs on molecular and protein benchmarks, achieving comparable or superior performance with up to 16× fewer sampled sequences. Our work bridges theoretical and practical advances in random walk based approaches, offering an efficient and expressive framework for learning on sparse graphs.
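To make the "random search" idea concrete: one natural instantiation is a randomized depth-first traversal, which, unlike a fixed-length random walk, visits every node of a connected graph exactly once before terminating. The sketch below is a hypothetical illustration (the paper's exact search procedure may differ); `random_search` and the adjacency-dict format are names assumed here, not from the paper.

```python
import random

def random_search(adj, root=None):
    """Randomized DFS over a connected graph given as {node: [neighbors]}.
    Emits each node exactly once, so a single search attains full node
    coverage -- a guarantee a length-bounded random walk cannot make.
    (Hypothetical sketch, not the paper's exact procedure.)"""
    root = random.choice(list(adj)) if root is None else root
    visited, seq, stack = {root}, [root], [root]
    while stack:
        unvisited = [v for v in adj[stack[-1]] if v not in visited]
        if unvisited:
            v = random.choice(unvisited)  # random branch choice
            visited.add(v)
            seq.append(v)
            stack.append(v)
        else:
            stack.pop()                   # backtrack when stuck
    return seq

# Tiny 4-node graph: every run yields a sequence covering all nodes.
adj = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3], 3: [2]}
seq = random_search(adj)
assert set(seq) == set(adj) and len(seq) == len(adj)
```

The resulting node sequences are what a downstream sequence encoder (e.g. a Transformer or RNN) would consume in place of random-walk sequences.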
Problem

Research questions and friction points this paper is trying to address.

Improving graph learning by ensuring full node coverage
Reducing sampling complexity for sparse graph edge coverage
Achieving universal approximation with isomorphism-invariant graph functions
Innovation

Methods, ideas, or system contributions that make the work stand out.

RSNNs use random searches for full node coverage
RSNNs achieve full edge coverage with logarithmic complexity
RSNNs paired with universal sequence models are universal approximators
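To build intuition for the logarithmic edge-coverage claim, the toy simulation below (an illustration under the randomized-DFS assumption above, not the paper's experiment) counts how many searches are needed to touch every edge of a 64-node ring, a sparse graph with $|E| = |V|$. Each single search covers only a spanning tree ($|V|-1$ edges), yet the union of a few searches covers all edges.

```python
import random

def search_edges(adj):
    """Set of edges traversed by one randomized DFS from a random root.
    (Hypothetical sketch of a 'random search', for illustration only.)"""
    root = random.choice(list(adj))
    visited, stack, edges = {root}, [root], set()
    while stack:
        u = stack[-1]
        unvisited = [v for v in adj[u] if v not in visited]
        if unvisited:
            v = random.choice(unvisited)
            visited.add(v)
            edges.add(frozenset((u, v)))  # undirected edge
            stack.append(v)
        else:
            stack.pop()
    return edges

n = 64  # ring graph: sparse, |E| = |V|
adj = {i: [(i - 1) % n, (i + 1) % n] for i in range(n)}
all_edges = {frozenset((i, (i + 1) % n)) for i in range(n)}

covered, k = set(), 0
while covered != all_edges:
    covered |= search_edges(adj)
    k += 1
print(k)  # typically a small constant, far below |V| = 64
```

On this graph each search misses only a single edge, so `k` is almost always 2 or 3; the contrast is with random walks, where the number of walks needed to cover all edges grows with $|V|$.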