AI Summary
This work addresses the scalability limitations of existing Transformer-based methods for single-cell RNA sequencing (scRNA-seq) clustering, which suffer from O(n²) computational complexity and struggle with large-scale datasets. To overcome this challenge, the authors propose BGFormer, a novel model that introduces bipartite graph attention to scRNA-seq clustering for the first time. By incorporating learnable anchor tokens and constructing sparse attention between cells and anchors, BGFormer reduces computational complexity to linear order. The method further integrates graph attention mechanisms with embedding space optimization, achieving substantial improvements in both clustering performance and scalability across multiple large-scale scRNA-seq datasets.
Abstract
Clustering is a critical task in analyzing single-cell RNA sequencing (scRNA-seq) data, as it groups cells with similar gene expression profiles. Transformers, as powerful foundational models, have been applied to scRNA-seq clustering. Their self-attention mechanism automatically assigns higher attention weights to cells within the same cluster, enhancing the distinction between clusters. Existing methods for scRNA-seq clustering, such as graph transformer-based models, treat each cell as a token in a sequence. Their computational and space complexities are $\mathcal{O}(n^2)$ with respect to the number of cells, limiting their applicability to large-scale scRNA-seq datasets. To address this challenge, we propose a Bipartite Graph Transformer-based clustering model (BGFormer) for scRNA-seq data. We introduce a set of learnable anchor tokens as shared reference points to represent the entire dataset. A bipartite graph attention mechanism is introduced to learn the similarity between cells and anchor tokens, bringing cells of the same class closer together in the embedding space. BGFormer achieves linear computational complexity with respect to the number of cells, making it scalable to large datasets. Experimental results on multiple large-scale scRNA-seq datasets demonstrate the effectiveness and scalability of BGFormer.
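The core idea behind the linear complexity claim is that each cell attends only to a small, fixed set of $m \ll n$ anchor tokens rather than to all $n$ other cells, so the attention cost drops from $\mathcal{O}(n^2 d)$ to $\mathcal{O}(nmd)$. The sketch below illustrates this cell-to-anchor attention pattern in NumPy; it is a minimal illustration under assumed dimensions, not the authors' implementation (the function name, the single-head formulation, and the random anchors standing in for learned ones are all hypothetical).

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def bipartite_anchor_attention(cells, anchors):
    """Attend each cell to m anchors instead of all n cells.

    cells:   (n, d) cell embeddings
    anchors: (m, d) anchor tokens (learnable in the real model)
    returns: (n, d) updated cell embeddings, cost O(n*m*d)
    """
    d = cells.shape[1]
    scores = cells @ anchors.T / np.sqrt(d)   # (n, m) cell-anchor affinities
    weights = softmax(scores, axis=1)         # each cell's distribution over anchors
    return weights @ anchors                  # aggregate anchor content per cell

rng = np.random.default_rng(0)
n, m, d = 10_000, 32, 64                      # m << n gives linear scaling in n
cells = rng.standard_normal((n, d))
anchors = rng.standard_normal((m, d))
out = bipartite_anchor_attention(cells, anchors)
```

Because the score matrix is only $n \times m$, memory also grows linearly in the number of cells, which is what makes the approach viable for large atlases where a dense $n \times n$ attention matrix would not fit in memory.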