AI Summary
Existing molecular representation methods are often confined to a single scale and overly reliant on local message passing, making it difficult to capture cross-scale patterns and long-range dependencies. This work proposes a unified self-supervised learning framework that generates chemically informed substructural tokens via graph Byte Pair Encoding (BPE) and integrates atomic- and fragment-level information through a parallel GNN-Transformer architecture. The approach introduces a novel dual-scale feature interaction mechanism that simultaneously models local atomic environments, substructural motifs, and long-range dependencies, substantially enhancing both representational capacity and interpretability. Evaluated on MoleculeNet, PharmaBench, and LRGB benchmarks, the model achieves state-of-the-art performance across multiple classification and regression tasks and effectively identifies chemically meaningful functional groups.
Abstract
Graph Transformers have recently attracted attention for molecular property prediction by combining the inductive biases of graph neural networks (GNNs) with the global receptive field of Transformers. However, many existing hybrid architectures are GNN-dominated, so the resulting representations remain heavily shaped by local message passing. Moreover, most existing methods operate at only a single structural granularity, limiting their ability to capture patterns that span multiple molecular scales. We introduce BiScale-GTR, a unified framework for self-supervised molecular representation learning that combines chemically grounded fragment tokenization with adaptive multi-scale reasoning. Our method improves graph Byte Pair Encoding (BPE) tokenization to produce consistent, chemically valid, and high-coverage fragment tokens, which serve as fragment-level inputs to a parallel GNN-Transformer architecture. Architecturally, atom-level representations learned by a GNN are pooled into fragment-level embeddings and fused with fragment token embeddings before Transformer reasoning, enabling the model to jointly capture local chemical environments, substructure-level motifs, and long-range molecular dependencies. Experiments on MoleculeNet, PharmaBench, and the Long Range Graph Benchmark (LRGB) demonstrate state-of-the-art performance across both classification and regression tasks. Attribution analysis further shows that BiScale-GTR highlights chemically meaningful functional motifs, providing interpretable links between molecular structure and predicted properties. Code will be released upon acceptance.
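The abstract describes a pipeline in which atom-level GNN representations are pooled into fragment-level embeddings, fused with fragment token embeddings, and then passed through a Transformer. The following is a minimal NumPy sketch of that dual-scale flow under toy assumptions: the molecule sizes, the hard atom-to-fragment assignment (standing in for graph-BPE tokens), the random weights, the additive fusion, and the single-head attention are all illustrative choices, not the paper's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy molecule: a 5-atom chain with 8-dimensional atom features (hypothetical sizes).
n_atoms, d = 5, 8
A = np.zeros((n_atoms, n_atoms))
for i in range(n_atoms - 1):
    A[i, i + 1] = A[i + 1, i] = 1.0
X = rng.normal(size=(n_atoms, d))            # initial atom features

# (1) One GNN message-passing step: mean-aggregate self + neighbors, linear map, ReLU.
W = rng.normal(size=(d, d)) / np.sqrt(d)
deg = A.sum(1, keepdims=True) + 1.0
H = np.maximum((X + A @ X) / deg @ W, 0.0)   # atom-level representations

# (2) Pool atoms into fragments via a hard assignment matrix
#     (illustrative: atoms 0-2 form fragment 0, atoms 3-4 form fragment 1,
#      playing the role of graph-BPE fragment tokens).
S = np.array([[1, 0], [1, 0], [1, 0], [0, 1], [0, 1]], dtype=float)
frag_pooled = S.T @ H / S.sum(0)[:, None]    # mean-pool per fragment

# (3) Fuse with learned fragment-token embeddings (here a random lookup table;
#     additive fusion is one simple option).
token_emb = rng.normal(size=(2, d))          # one embedding per fragment token
F = frag_pooled + token_emb

# (4) Single-head self-attention over fragments: long-range reasoning step.
scores = F @ F.T / np.sqrt(d)
attn = np.exp(scores - scores.max(1, keepdims=True))
attn /= attn.sum(1, keepdims=True)
out = attn @ F                               # contextualized fragment embeddings

print(out.shape)  # one d-dimensional embedding per fragment
```

In the real model the assignment matrix comes from the chemically valid graph-BPE tokenization and the weights are learned end-to-end; the sketch only shows how the atom and fragment scales meet before the Transformer stage.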