Graph Distribution-valued Signals: A Wasserstein Space Perspective

📅 2025-09-30
🤖 AI Summary
Traditional graph signal processing (GSP) relies on synchronous observations, deterministic modeling, and strict node-wise correspondences—strong assumptions often violated in real-world uncertain graph data. To address these limitations, we propose Graph Distribution-Valued Signals (GDSs), a novel framework that models node signals as probability distributions in the Wasserstein space, thereby generalizing GSP from vector-valued to distribution-valued signals. This paradigm shift unifies and extends classical GSP: deterministic, synchronous signals emerge as special cases; node-level uncertainty and stochasticity are inherently captured; and alignment-free, unpaired distribution filtering becomes feasible. Leveraging optimal transport theory and Wasserstein geometry, we establish a systematic theoretical foundation for distribution-domain graph filtering, representation learning, and inference. Empirical evaluation on prediction tasks demonstrates substantial improvements in filter learning performance and robust modeling of complex, uncertain graph signals.
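The distribution-domain filtering idea can be sketched in its simplest setting: each node carries a one-dimensional empirical distribution, represented by its sorted samples (an empirical quantile function). In 1D, the 2-Wasserstein distance between equal-size samples is the L2 distance between sorted samples, and W2 barycenters are averages of quantile functions, so a neighborhood-averaging filter becomes ordinary averaging in quantile space. The code below is an illustrative sketch under those assumptions, not the paper's actual filters (`w2_1d` and `quantile_filter` are hypothetical names introduced here).

```python
import numpy as np

def w2_1d(x, y):
    """2-Wasserstein distance between two 1D empirical distributions with
    equal sample counts: L2 distance between sorted (quantile) samples."""
    x, y = np.sort(x), np.sort(y)
    return np.sqrt(np.mean((x - y) ** 2))

def quantile_filter(A, signals):
    """One-hop averaging filter on 1D distribution-valued signals.

    A       : (n, n) nonnegative adjacency matrix.
    signals : (n, m) array; row i holds m samples of node i's distribution.
    Returns : (n, m) filtered quantile functions, i.e. the W2 barycenter of
              each node's neighborhood (valid in 1D, where barycenters are
              weighted averages of quantile functions).
    """
    Q = np.sort(signals, axis=1)            # empirical quantile functions
    W = A / A.sum(axis=1, keepdims=True)    # row-stochastic filter weights
    return W @ Q                            # barycentric average per node
```

The reduction to plain matrix multiplication is specific to the 1D case; for multivariate node distributions, Wasserstein barycenters generally require iterative optimal-transport solvers.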

📝 Abstract
We introduce a novel framework for graph signal processing (GSP) that models signals as graph distribution-valued signals (GDSs), which are probability distributions in the Wasserstein space. This approach overcomes key limitations of classical vector-based GSP, including the assumption of synchronous observations over vertices, the inability to capture uncertainty, and the requirement for strict correspondence in graph filtering. By representing signals as distributions, GDSs naturally encode uncertainty and stochasticity, while strictly generalizing traditional graph signals. We establish a systematic dictionary mapping core GSP concepts to their GDS counterparts, demonstrating that classical definitions are recovered as special cases. The effectiveness of the framework is validated through graph filter learning for prediction tasks, supported by experimental results.
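The abstract's claim that classical graph signals are recovered as special cases can be checked concretely: a deterministic vector-valued signal corresponds to a Dirac distribution at each node, and under that degenerate case a quantile-space averaging filter collapses to the usual matrix-vector graph filter. The sketch below assumes 1D empirical distributions; `distribution_filter` is an illustrative construction, not the paper's implementation.

```python
import numpy as np

def distribution_filter(A, signals):
    """Average neighbor distributions in quantile space (1D W2 barycenter)."""
    W = A / A.sum(axis=1, keepdims=True)    # row-stochastic weights
    return W @ np.sort(signals, axis=1)

# Classical GSP as the m = 1 special case: each node holds a Dirac
# distribution, i.e. a single sample equal to the deterministic signal value.
A = np.array([[0., 1., 1.],
              [1., 0., 1.],
              [1., 1., 0.]])
x = np.array([1., 2., 3.])                  # classical vector-valued signal
dirac = x[:, None]                          # shape (3, 1): one sample per node

filtered = distribution_filter(A, dirac).ravel()
classical = (A / A.sum(axis=1, keepdims=True)) @ x
assert np.allclose(filtered, classical)     # reduces to a classical filter
```

With row-normalized weights, both paths give the neighbor mean at each node, matching the "strict generalization" statement for this degenerate case.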
Problem

Research questions and friction points this paper is trying to address.

Modeling graph signals as distributions to encode uncertainty
Overcoming limitations of synchronous observations in graph processing
Generalizing graph filtering beyond strict vertex correspondence
Innovation

Methods, ideas, or system contributions that make the work stand out.

Models graph signals as probability distributions in Wasserstein space
Encodes uncertainty and stochasticity through distribution-valued signals
Establishes systematic mapping from classical to distribution-based GSP concepts