GRNFormer: A Biologically-Guided Framework for Integrating Gene Regulatory Networks into RNA Foundation Models

📅 2025-03-03
📈 Citations: 0
Influential: 0
🤖 AI Summary
Current scRNA-seq foundation models effectively capture gene expression patterns but neglect prior knowledge of gene regulation and fail to integrate complementary regulatory signals from multi-omics data. To address this, we propose a biology-guided, multi-scale gene regulatory network (GRN) fusion framework. Our method introduces: (i) hierarchical GRN construction at both cell-type and single-cell resolutions; (ii) a structure-aware integration mechanism comprising a graph topology adapter and a biological-prior-driven edge perturbation scheme based on co-expression; and (iii) end-to-end joint training of multi-omics-inferred GRNs, graph neural networks (GNNs), and RNA foundation models. Evaluated on drug response prediction, single-cell drug classification, and gene perturbation prediction, our approach achieves improvements of +3.6% in correlation, +9.6% in AUC, and +1.1% in accuracy, respectively, outperforming state-of-the-art methods across all tasks.
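The hierarchical GRN construction at cell-type and single-cell resolutions can be pictured with a minimal sketch. This is an illustrative assumption of the general idea, not the paper's pipeline: cell-type GRNs are derived here from within-type co-expression, then refined to per-cell GRNs by masking edges between genes a cell does not express; the function names and the threshold are hypothetical.

```python
import numpy as np

def hierarchical_grns(expr, cell_types, threshold=0.5):
    """Sketch: build a co-expression GRN per cell type, then refine
    it per cell. `expr` is a (cells x genes) matrix; `cell_types` is a
    per-cell label list. Threshold and masking rule are illustrative."""
    grns = {}
    labels = np.array(cell_types)
    for ct in set(cell_types):
        sub = expr[labels == ct]                  # cells of this type
        corr = np.corrcoef(sub.T)                 # gene-gene Pearson corr
        np.fill_diagonal(corr, 0.0)
        grns[ct] = (np.abs(corr) > threshold).astype(float)

    def cell_grn(cell_idx):
        # Single-cell resolution: keep only edges between genes the
        # cell actually expresses (a simple refinement heuristic).
        expressed = (expr[cell_idx] > 0).astype(float)
        return grns[cell_types[cell_idx]] * np.outer(expressed, expressed)

    return grns, cell_grn
```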

📝 Abstract
Foundation models for single-cell RNA sequencing (scRNA-seq) have shown promising capabilities in capturing gene expression patterns. However, current approaches face critical limitations: they ignore biological prior knowledge encoded in gene regulatory relationships and fail to leverage multi-omics signals that could provide complementary regulatory insights. In this paper, we propose GRNFormer, a new framework that systematically integrates multi-scale Gene Regulatory Networks (GRNs) inferred from multi-omics data into RNA foundation model training. Our framework introduces two key innovations. First, we introduce a pipeline for constructing hierarchical GRNs that capture regulatory relationships at both cell-type-specific and cell-specific resolutions. Second, we design a structure-aware integration framework that addresses the information asymmetry in GRNs through two technical advances: (1) a graph topological adapter using multi-head cross-attention to weight regulatory relationships dynamically, and (2) a novel edge perturbation strategy that perturbs GRNs with biologically-informed co-expression links to augment graph neural network training. Comprehensive experiments have been conducted on three representative downstream tasks across multiple model architectures to demonstrate the effectiveness of GRNFormer. It achieves consistent improvements over state-of-the-art (SoTA) baselines: a 3.6% increase in drug response prediction correlation, a 9.6% improvement in single-cell drug classification AUC, and a 1.1% average gain in gene perturbation prediction accuracy.
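The graph topological adapter described in the abstract, which uses multi-head cross-attention to weight regulatory relationships, can be sketched roughly as follows. This is a hedged NumPy sketch of the general mechanism, not the paper's implementation: gene-token embeddings from the foundation model attend over regulator embeddings, with GRN non-edges masked out so attention follows the graph topology. All names, the residual fusion, and the masking scheme are assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def grn_cross_attention(gene_tokens, regulator_emb, adjacency, n_heads=4):
    """Sketch of a GRN-masked multi-head cross-attention adapter.
    gene_tokens: (n_genes, d) queries; regulator_emb: (n_reg, d)
    keys/values; adjacency: (n_genes, n_reg) GRN edge indicator."""
    n_q, d = gene_tokens.shape
    assert d % n_heads == 0
    dh = d // n_heads

    def heads(x):  # (n, d) -> (n_heads, n, dh)
        return x.reshape(x.shape[0], n_heads, dh).transpose(1, 0, 2)

    q = heads(gene_tokens)
    k = v = heads(regulator_emb)
    scores = q @ k.transpose(0, 2, 1) / np.sqrt(dh)   # (h, n_q, n_reg)
    mask = np.where(adjacency > 0, 0.0, -1e9)         # forbid non-edges
    w = softmax(scores + mask)                        # per-edge weights
    out = (w @ v).transpose(1, 0, 2).reshape(n_q, d)
    return gene_tokens + out                          # residual fusion
```

With a fully masked row except one regulator, each gene token receives exactly that regulator's embedding, which is the "dynamic weighting restricted to GRN edges" behaviour the adapter is meant to capture.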
Problem

Research questions and friction points this paper is trying to address.

Current scRNA-seq foundation models ignore biological prior knowledge encoded in gene regulatory relationships
Multi-omics signals that could provide complementary regulatory insights go unused
Downstream predictions of drug response and gene perturbation effects consequently underperform
Innovation

Methods, ideas, or system contributions that make the work stand out.

Integrates multi-scale Gene Regulatory Networks into RNA foundation model training
Uses a graph topological adapter with multi-head cross-attention to weight regulatory relationships dynamically
Applies a biologically informed edge perturbation strategy to augment graph neural network training
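The edge perturbation idea above, perturbing GRNs with biologically informed co-expression links, can be sketched in a few lines. This is an illustrative assumption of the general recipe, not the paper's exact procedure: a fraction of GRN edges is randomly dropped, and the most strongly co-expressed non-edge gene pairs are added as replacements; the fractions and the Pearson criterion are hypothetical choices.

```python
import numpy as np

def perturb_edges(expr, edges, add_frac=0.2, drop_frac=0.2, seed=0):
    """Sketch: augment a GRN for GNN training by dropping random edges
    and adding high co-expression non-edges. `expr` is (cells x genes);
    `edges` is a list of (i, j) gene-index pairs."""
    rng = np.random.default_rng(seed)
    n_genes = expr.shape[1]
    edge_set = {tuple(sorted(e)) for e in edges}
    kept = [e for e in edge_set if rng.random() > drop_frac]
    corr = np.corrcoef(expr.T)                 # gene-gene Pearson corr
    candidates = sorted(
        ((abs(corr[i, j]), (i, j))
         for i in range(n_genes) for j in range(i + 1, n_genes)
         if (i, j) not in edge_set),
        reverse=True)                          # strongest co-expression first
    n_add = max(1, int(add_frac * len(edge_set)))
    added = [pair for _, pair in candidates[:n_add]]
    return kept + added
```

Each call with a different seed yields a different perturbed graph, which is how such a strategy would supply augmented views of the GRN during GNN training.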