GeneZip: Region-Aware Compression for Long Context DNA Modeling

📅 2026-02-19
🤖 AI Summary
Genomic sequences span billions of base pairs, posing substantial computational challenges for long-context modeling. This work proposes a region-aware compression mechanism that, for the first time, integrates dynamic routing with self-supervised DNA language modeling. Leveraging the biological prior that genomic information is unevenly distributed, the method adaptively allocates representational resources between coding and non-coding regions. It achieves a 137.6× sequence compression ratio with negligible loss in language modeling quality, increasing perplexity by only 0.31. The approach enables training of a 636M-parameter model on a single A100 80GB GPU with a 1M-bp context window, achieving comparable or superior performance across multiple downstream tasks.

📝 Abstract
Genomic sequences span billions of base pairs (bp), posing a fundamental challenge for genome-scale foundation models. Existing approaches largely sidestep this barrier by either scaling relatively small models to long contexts or relying on heavy multi-GPU parallelism. Here we introduce GeneZip, a DNA compression model that leverages a key biological prior: genomic information is highly imbalanced. Coding regions comprise only a small fraction (about 2 percent) of the genome yet are information-dense, whereas most non-coding sequence is comparatively information-sparse. GeneZip couples HNet-style dynamic routing with a region-aware compression-ratio objective, enabling adaptive allocation of representation budget across genomic regions. As a result, GeneZip learns region-aware compression and achieves 137.6x compression with only a 0.31 perplexity increase. On downstream long-context benchmarks, GeneZip achieves comparable or better performance on contact map prediction, expression quantitative trait loci prediction, and enhancer-target gene prediction. By reducing effective sequence length, GeneZip unlocks simultaneous scaling of context and capacity: compared to the prior state-of-the-art model JanusDNA, it enables training models 82.6x larger at 1M-bp context, supporting a 636M-parameter GeneZip model. All experiments in this paper can be trained on a single A100 80GB GPU.
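The region-aware compression-ratio objective described above can be sketched in a minimal form. The paper does not publish its loss formulation here, so the function below is an illustrative assumption: a router emits per-base boundary (keep) probabilities, and the loss pushes the expected retention rate toward region-specific targets, keeping more tokens in coding regions (low compression ratio) and fewer in non-coding regions (high ratio). All names and the default targets (`r_coding`, `r_noncoding`) are hypothetical.

```python
import numpy as np

def region_aware_ratio_loss(boundary_probs, coding_mask,
                            r_coding=4.0, r_noncoding=256.0):
    """Hypothetical region-aware compression-ratio objective (sketch).

    boundary_probs: per-base probability that the router keeps a token.
    coding_mask:    boolean array, True where the base lies in a coding region.
    Penalizes squared deviation of the mean keep-rate from 1/ratio targets,
    so coding regions are compressed gently and non-coding regions aggressively.
    """
    coding = boundary_probs[coding_mask]
    noncoding = boundary_probs[~coding_mask]
    loss = 0.0
    if coding.size:
        loss += (coding.mean() - 1.0 / r_coding) ** 2
    if noncoding.size:
        loss += (noncoding.mean() - 1.0 / r_noncoding) ** 2
    return loss
```

With targets of 4x for coding and 256x for non-coding sequence, a genome that is ~2% coding would average out near the paper's reported overall ratio in spirit, though the actual routing mechanism (HNet-style dynamic chunking) is learned end to end rather than fixed per region.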

Problem

Research questions and friction points this paper is trying to address.

DNA compression
long context modeling
genomic sequences
foundation models
region-aware compression

Innovation

Methods, ideas, or system contributions that make the work stand out.

region-aware compression
genomic foundation models
dynamic routing
long-context DNA modeling
information-dense coding regions