🤖 AI Summary
This work addresses the challenge that protein fitness optimization is inherently a discrete combinatorial problem, while existing approaches predominantly rely on continuous representations and focus narrowly on prediction accuracy, which hinders efficient optimization. The authors propose Q-BIOLAT, a representation-centric framework that maps pretrained protein language model embeddings into compact binary latent spaces and fits a QUBO (Quadratic Unconstrained Binary Optimization) surrogate capturing first- and second-order interactions among the latent bits. Classical combinatorial optimization techniques, including simulated annealing and genetic algorithms, are then employed for effective search. Experiments demonstrate that simple structured representations such as PCA yield high-entropy, decodable, and optimization-friendly binary landscapes, whereas learned autoencoder representations collapse after binarization; the approach improves performance across multiple datasets and bridges modern machine learning with discrete and quantum-inspired optimization methods.
📝 Abstract
Protein fitness optimization is inherently a discrete combinatorial problem, yet most learning-based approaches rely on continuous representations and are primarily evaluated through predictive accuracy. We introduce Q-BIOLAT, a framework for modeling and optimizing protein fitness landscapes in compact binary latent spaces. Starting from pretrained protein language model embeddings, we construct binary latent representations and learn a quadratic unconstrained binary optimization (QUBO) surrogate that captures unary and pairwise interactions.
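A surrogate with unary and pairwise terms over a binary code $\mathbf{z} \in \{0,1\}^d$ takes the standard QUBO form (the notation below is ours, for illustration, and not necessarily the paper's):

$$
f(\mathbf{z}) \;=\; \sum_{i=1}^{d} q_i z_i \;+\; \sum_{i<j} Q_{ij}\, z_i z_j, \qquad z_i \in \{0,1\},
$$

where the unary coefficients $q_i$ and pairwise coefficients $Q_{ij}$ are fit to observed fitness values, and optimization then searches $\{0,1\}^d$ for a maximizer.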
Beyond its formulation, Q-BIOLAT provides a representation-centric perspective on protein fitness modeling. We show that representations with similar predictive performance can induce fundamentally different optimization landscapes. In particular, learned autoencoder-based representations collapse after binarization, producing degenerate latent spaces that fail to support combinatorial search, whereas simple structured representations such as PCA yield high-entropy, decodable, and optimization-friendly latent spaces.
Across multiple datasets and data regimes, we demonstrate that classical combinatorial optimization methods, including simulated annealing, genetic algorithms, and greedy hill climbing, are highly effective in structured binary latent spaces. By expressing the objective in QUBO form, our approach connects modern machine learning with discrete and quantum-inspired optimization.
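The pipeline described above — binarizing embeddings, fitting a unary-plus-pairwise surrogate, and searching with simulated annealing — can be sketched roughly as follows. This is not the authors' code: the median-threshold binarization, the ridge-regularized least-squares fit, the cooling schedule, and all hyperparameters are illustrative assumptions, and the demo uses synthetic data in place of real protein language model embeddings.

```python
import numpy as np

rng = np.random.default_rng(0)

def binarize(E):
    """Threshold each embedding coordinate at its median -> bits in {0, 1}."""
    return (E > np.median(E, axis=0)).astype(float)

def qubo_features(Z):
    """Unary bits plus all pairwise products z_i * z_j (i < j)."""
    iu, ju = np.triu_indices(Z.shape[1], k=1)
    return np.hstack([Z, Z[:, iu] * Z[:, ju]])

def fit_qubo(Z, y, ridge=1e-3):
    """Ridge-regularized least-squares fit of unary + pairwise coefficients."""
    X = qubo_features(Z)
    return np.linalg.solve(X.T @ X + ridge * np.eye(X.shape[1]), X.T @ y)

def qubo_fitness(z, w):
    """Surrogate fitness f(z) = sum_i q_i z_i + sum_{i<j} Q_ij z_i z_j."""
    return (qubo_features(z[None, :]) @ w).item()

def simulated_annealing(w, d, steps=2000, t0=1.0, t1=0.01, seed=0):
    """Maximize the QUBO surrogate with single-bit-flip proposals."""
    sa_rng = np.random.default_rng(seed)
    z = sa_rng.integers(0, 2, d).astype(float)
    e = qubo_fitness(z, w)
    best_z, best_e = z.copy(), e
    for t in range(steps):
        temp = t0 * (t1 / t0) ** (t / steps)  # geometric cooling schedule
        i = sa_rng.integers(d)
        z[i] = 1.0 - z[i]                     # propose a single bit flip
        e_new = qubo_fitness(z, w)
        if e_new >= e or sa_rng.random() < np.exp((e_new - e) / temp):
            e = e_new                         # accept the move
            if e > best_e:
                best_z, best_e = z.copy(), e
        else:
            z[i] = 1.0 - z[i]                 # reject: revert the flip
    return best_z, best_e

# Toy demo: synthetic embeddings standing in for PCA-reduced PLM embeddings,
# with a planted unary + pairwise fitness function.
E = rng.normal(size=(200, 8))
Z = binarize(E)
true_w = rng.normal(size=qubo_features(Z).shape[1])
y = qubo_features(Z) @ true_w
w = fit_qubo(Z, y)
z_best, f_best = simulated_annealing(w, d=8)
print(z_best, f_best)
```

In a small latent space like this (2^8 = 256 states), single-bit-flip annealing is a reasonable baseline; the paper also reports genetic algorithms and greedy hill climbing as effective alternatives.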
Our implementation and dataset are publicly available at: https://github.com/HySonLab/Q-BIOLAT-Extended