Pearl: A Foundation Model for Placing Every Atom in the Right Location

📅 2025-10-28
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
Predicting the 3D structures of protein–ligand complexes remains a critical bottleneck in computational drug discovery. To address this, we introduce the first large-scale SO(3)-equivariant diffusion foundation model tailored for joint protein–ligand folding, designed to improve both prediction accuracy and physical plausibility. Methodologically, we (1) construct a high-quality synthetic dataset to mitigate the scarcity of experimentally resolved complex structures; (2) develop an SO(3)-equivariant diffusion architecture that rigorously enforces rotational symmetry; and (3) propose a multi-chain templated, controllable inference mechanism enabling co-modeling of proteins and non-polymeric ligands. Experiments demonstrate substantial improvements over AlphaFold 3: +14.5% and +14.2% in the fraction of predictions achieving RMSD < 2 Å on the Runs N’Poses and PoseBusters benchmarks, respectively, and a 3.6× gain under the stricter RMSD < 1 Å threshold. Together, these results advance high-accuracy, physically grounded drug discovery.
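The headline metric above, the fraction of predicted poses within 2 Å RMSD of the reference structure, is simple to compute once predicted and reference atoms are matched. A minimal sketch (illustrative only; the function names are ours, and atom correspondence and ligand symmetry handling are assumed already resolved):

```python
import numpy as np

def rmsd(pred, ref):
    """Root-mean-square deviation between two matched (N, 3) coordinate arrays."""
    return np.sqrt(np.mean(np.sum((pred - ref) ** 2, axis=1)))

def success_rate(pose_pairs, threshold=2.0):
    """Fraction of (predicted, reference) pose pairs with RMSD below `threshold` Å."""
    return float(np.mean([rmsd(p, r) < threshold for p, r in pose_pairs]))
```

Tightening `threshold` to 1.0 gives the stricter criterion used for the 3.6× comparison.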

📝 Abstract
Accurately predicting the three-dimensional structures of protein-ligand complexes remains a fundamental challenge in computational drug discovery that limits the pace and success of therapeutic design. Deep learning methods have recently shown strong potential as structural prediction tools, achieving promising accuracy across diverse biomolecular systems. However, their performance and utility are constrained by scarce experimental data, inefficient architectures, physically invalid poses, and the limited ability to exploit auxiliary information available at inference. To address these issues, we introduce Pearl (Placing Every Atom in the Right Location), a foundation model for protein-ligand cofolding at scale. Pearl addresses these challenges with three key innovations: (1) training recipes that include large-scale synthetic data to overcome data scarcity; (2) architectures that incorporate an SO(3)-equivariant diffusion module to inherently respect 3D rotational symmetries, improving generalization and sample efficiency; and (3) controllable inference, including a generalized multi-chain templating system supporting both protein and non-polymeric components as well as dual unconditional/conditional modes. Pearl establishes new state-of-the-art performance in protein-ligand cofolding. On the key metric of generating accurate (RMSD < 2 Å) and physically valid poses, Pearl surpasses AlphaFold 3 and other open-source baselines on the public Runs N'Poses and PoseBusters benchmarks, delivering 14.5% and 14.2% improvements, respectively, over the next best model. In the pocket-conditional cofolding regime, Pearl delivers a 3.6× improvement on a proprietary set of challenging, real-world drug targets at the more rigorous RMSD < 1 Å threshold. Finally, we demonstrate that model performance correlates directly with the size of the synthetic dataset used in training.
Problem

Research questions and friction points this paper is trying to address.

Predicting 3D protein-ligand complex structures accurately
Overcoming data scarcity and inefficient architectures in prediction
Generating physically valid poses for computational drug discovery
Innovation

Methods, ideas, or system contributions that make the work stand out.

Large-scale synthetic data training to overcome scarcity
SO(3)-equivariant diffusion module for 3D symmetries
Controllable inference with multi-chain templating system
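SO(3) equivariance means the model commutes with 3D rotations: rotating the input coordinates rotates the output identically, so the network never spends capacity learning arbitrary orientations. The property can be checked numerically; the sketch below uses our own conventions (row-vector coordinates, QR-based rotation sampling) and is not the paper's architecture or test code:

```python
import numpy as np

def random_rotation(rng):
    """Sample a uniform random rotation matrix in SO(3) via QR decomposition."""
    q, r = np.linalg.qr(rng.normal(size=(3, 3)))
    q = q * np.sign(np.diag(r))   # fix column signs to make the factorization unique
    if np.linalg.det(q) < 0:
        q[:, 0] = -q[:, 0]        # flip one axis so det(q) = +1: a proper rotation
    return q

def is_so3_equivariant(f, coords, rng, tol=1e-8):
    """Check f(R x) == R f(x) for a random rotation R, with coords as rows of (N, 3)."""
    rot = random_rotation(rng)
    return np.allclose(f(coords @ rot.T), f(coords) @ rot.T, atol=tol)
```

For instance, centering coordinates at their mean passes this check, while adding a fixed offset vector does not.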
Alejandro Dobles
Genesis Molecular AI
Nina Jovic
Genesis Molecular AI
Kenneth Leidal
Genesis Molecular AI
Pranav M. Murugan
Genesis Molecular AI
David C. Williams
Genesis Molecular AI
Drausin Wulsin
Genesis Molecular AI
Nate Gruver
New York University
Deep Learning · Generative Models · AI for Science
Christina X. Ji
Genesis Molecular AI
Korrawat Pruegsanusak
Genesis Molecular AI
Gianluca Scarpellini
Genesis Molecular AI
Ansh Sharma
Genesis Molecular AI
Wojciech Swiderski
Genesis Molecular AI
Andrea Bootsma
Genesis Molecular AI
Richard Strong Bowen
Genesis Molecular AI
Charlotte Chen
Genesis Molecular AI
Jamin Chen
Genesis Molecular AI
Marc André Damgen
Genesis Molecular AI
Roy Tal Dew
NVIDIA
Benjamin Difrancesco
Genesis Molecular AI
J. D. Fishman
Genesis Molecular AI
Alla Ivanova
Genesis Molecular AI
Zach Kagin
Genesis Molecular AI
David Li-Bland
Genesis Molecular AI
Zuli Liu
Genesis Molecular AI
Igor Morozov
Genesis Molecular AI
Jeffrey Ouyang-Zhang
Genesis Molecular AI
Frank C. Pickard IV
Genesis Molecular AI
Kushal S. Shah
NVIDIA
Ben Shor
Genesis Molecular AI
Gabriel Monteiro da Silva
Genesis Molecular AI
Maxx H. Tessmer
Genesis Molecular AI
C. Tilbury
Genesis Molecular AI
Cyr Vetcher
Genesis Molecular AI
Daniel Zeng
Genesis Molecular AI
Maruan Al-Shedivat
Genesis Therapeutics
Artificial Intelligence · Machine Learning
Aleksandra Faust
Genesis Therapeutics, Burlingame, CA
Reinforcement Learning · Foundation Models · Autonomy · Machine Learning · AGI
Evan N. Feinberg
Genesis Molecular AI
Michael V. LeVine
Genesis Molecular AI
Matteus Pan
Genesis Molecular AI