Aggregation-aware MLP: An Unsupervised Approach for Graph Message-passing

πŸ“… 2025-07-27
πŸ“ˆ Citations: 0
✨ Influential: 0
πŸ€– AI Summary
Graph neural networks (GNNs) commonly rely on fixed aggregation functions (e.g., Mean/Sum/Max), leading to poor generalization on heterophilous graphs; existing adaptive approaches depend heavily on labeled data. To address this, we propose the unsupervised Aggregation-aware Multi-Layer Perceptron (AMLP), the first method to embed aggregation pattern modeling into a lightweight MLP framework without requiring labels. AMLP captures high-order structural dependencies through a graph reconstruction step that models higher-order components, and uses a single-layer encoder to explicitly represent varying degrees of heterophily. Extensive experiments demonstrate that AMLP significantly outperforms state-of-the-art GNNs and unsupervised baselines on node clustering and classification across diverse heterophilous graphs. It exhibits strong robustness and generalization, eliminating reliance on both handcrafted aggregation functions and labeled supervision.

πŸ“ Abstract
Graph Neural Networks (GNNs) have become a dominant approach to learning graph representations, primarily because of their message-passing mechanisms. However, GNNs typically adopt a fixed aggregator function such as Mean, Max, or Sum without principled reasoning behind the selection. This rigidity, especially in the presence of heterophily, often leads to poor, problem-dependent performance. Although some attempts address this by designing more sophisticated aggregation functions, these methods tend to rely heavily on labeled data, which is often scarce in real-world tasks. In this work, we propose a novel unsupervised framework, "Aggregation-aware Multilayer Perceptron" (AMLP), which shifts the paradigm from directly crafting aggregation functions to making the MLP adaptive to aggregation. Our lightweight approach consists of two key steps: first, we utilize a graph reconstruction method that facilitates high-order grouping effects, and second, we employ a single-layer network to encode varying degrees of heterophily, thereby improving the capacity and applicability of the model. Extensive experiments on node clustering and classification demonstrate the superior performance of AMLP, highlighting its potential for diverse graph learning scenarios.
Problem

Research questions and friction points this paper is trying to address.

GNNs use fixed aggregators without principled selection reasoning
Existing methods rely heavily on scarce labeled data
How to adapt an MLP to graph aggregation without supervision
Innovation

Methods, ideas, or system contributions that make the work stand out.

Unsupervised MLP adapts to graph aggregation
Graph reconstruction enables high-order grouping
Single-layer network encodes heterophily levels
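The two steps above can be sketched in code. The paper's exact reconstruction and encoder are not given here, so everything below is an assumption-laden illustration: the function name `amlp_sketch`, the symmetric adjacency normalization, the mixing of adjacency powers as a stand-in for "high-order grouping", and the random single-layer weights are all placeholders, not the authors' method.

```python
import numpy as np

def amlp_sketch(A, X, order=2, hidden_dim=16, seed=0):
    """Hedged sketch of the AMLP idea (all details are assumptions):
    1) reconstruct the graph with a high-order grouping effect,
       approximated here by averaging powers of the normalized adjacency;
    2) apply a single-layer encoder to the aggregated features.
    """
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    # Symmetric normalization with self-loops: D^{-1/2} (A + I) D^{-1/2}
    A_hat = A + np.eye(n)
    d_inv_sqrt = 1.0 / np.sqrt(A_hat.sum(axis=1))
    S = (A_hat * d_inv_sqrt).T * d_inv_sqrt  # D^{-1/2} A_hat D^{-1/2}
    # High-order component: average the first `order` powers of S
    S_high = sum(np.linalg.matrix_power(S, k)
                 for k in range(1, order + 1)) / order
    X_agg = S_high @ X  # aggregation-aware features
    # Single-layer encoder (untrained here; real weights would be learned
    # with the paper's unsupervised objective)
    W = rng.standard_normal((X.shape[1], hidden_dim)) * 0.1
    return np.tanh(X_agg @ W)
```

In an actual implementation the encoder weights would be trained against the paper's unsupervised objective rather than drawn at random; the sketch only shows where the high-order reconstruction and the single-layer encoder sit in the pipeline.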
Xuanting Xie
University of Electronic Science and Technology of China
Graph Neural Networks, Clustering
Bingheng Li
Michigan State University
Erlin Pan
Alibaba Group
Zhao Kang
School of Computer Science and Engineering, University of Electronic Science and Technology of China, Chengdu, China
Wenyu Chen
Massachusetts Institute of Technology
optimization, statistical learning