🤖 AI Summary
Graph classification for large-scale pathological graphs (e.g., tissue slide-derived graphs) suffers from high annotation costs because it traditionally requires expensive subgraph-level labels.
Method: We propose a weakly supervised graph classification framework that operates solely on graph-level labels. It automatically identifies discriminative local subgraphs via a dual-path strategy combining sliding windows and breadth-first search (BFS), models them using a graph attention network (GAT), and employs an attention-driven importance scoring and weighted aggregation mechanism for label propagation and global prediction.
Contribution/Results: This is the first systematic application of weak supervision to large-scale graph classification, enabling key structural localization without any subgraph annotations. Experiments on multiple pathological graph datasets demonstrate a 3.2% accuracy gain over fully supervised baselines; remarkably, using only 1% of the labeling cost, our method achieves 92% of the fully supervised performance—substantially enhancing practicality and generalizability in clinical settings.
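One half of the dual-path extraction strategy described above is breadth-first search from seed nodes. As a minimal illustration (not the authors' implementation; the adjacency format, seed choice, and size cap are assumptions), a BFS walk that collects a bounded local subgraph can be sketched as:

```python
from collections import deque

def bfs_subgraph(adj, seed, max_nodes):
    """Collect up to max_nodes nodes reachable from seed, in BFS order.

    adj is a dict mapping each node id to a list of neighbor ids
    (a hypothetical representation of, e.g., a cell graph).
    """
    visited = {seed}
    queue = deque([seed])
    order = []
    while queue and len(order) < max_nodes:
        node = queue.popleft()
        order.append(node)
        for nbr in adj.get(node, ()):
            if nbr not in visited:
                visited.add(nbr)
                queue.append(nbr)
    return order

# Toy graph: a path 0-1-2-3-4 with a branch 1-5.
adj = {0: [1], 1: [0, 2, 5], 2: [1, 3], 3: [2, 4], 4: [3], 5: [1]}
print(bfs_subgraph(adj, seed=0, max_nodes=4))  # → [0, 1, 2, 5]
```

Seeding BFS at many nodes yields overlapping local neighborhoods, complementing the sliding-window path, which instead covers the graph with fixed regions.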
📝 Abstract
Graph classification plays a pivotal role in various domains, including pathology, where images can be represented as graphs: nodes may represent individual nuclei, and edges capture the spatial or functional relationships between them. Often, the overall label of the graph, such as a cancer type or disease state, is determined by patterns within smaller, localized regions of the image. This work introduces a weakly supervised graph classification framework that leverages two subgraph extraction techniques: (1) a sliding-window approach and (2) a BFS-based approach. Subgraphs are processed by a Graph Attention Network (GAT), whose attention mechanism identifies the most informative subgraphs for classification. Weak supervision is achieved by propagating graph-level labels to subgraphs, eliminating the need for detailed subgraph annotations.
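The attention-driven scoring and weighted aggregation can be illustrated with a minimal sketch. This is not the paper's GAT; it only shows the pooling step, assuming each subgraph has already been encoded into an embedding vector, with a hypothetical learned attention vector standing in for the trained scorer:

```python
import math

def attention_pool(subgraph_embs, attn_vec):
    """Score each subgraph embedding by its dot product with a learned
    attention vector, softmax-normalize the scores into importance
    weights, and return the weighted-sum graph embedding plus weights."""
    scores = [sum(a * e for a, e in zip(attn_vec, emb)) for emb in subgraph_embs]
    m = max(scores)                               # shift for numerical stability
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    weights = [e / total for e in exps]           # importance of each subgraph
    dim = len(subgraph_embs[0])
    pooled = [sum(w * emb[d] for w, emb in zip(weights, subgraph_embs))
              for d in range(dim)]
    return pooled, weights

# Three hypothetical subgraph embeddings and a toy attention vector.
embs = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
attn = [2.0, 0.0]
pooled, weights = attention_pool(embs, attn)
```

The resulting weights double as an explanation signal: high-weight subgraphs are the localized regions the model deems discriminative, which is how key structures can be localized without subgraph annotations.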