🤖 AI Summary
This work proposes GenSR, a novel framework that, from a Bayesian perspective, reformulates symbolic regression as maximizing the conditional distribution \( p(\text{equation} \mid \text{numerical data}) \). Unlike traditional approaches that search in a discrete equation space—where structural modifications are decoupled from numerical behavior, leaving fitting-error feedback noisy and uninformative—GenSR constructs a generative latent space exhibiting both symbolic continuity and local numerical smoothness. The framework employs a dual-branch conditional variational autoencoder (CVAE) to learn structured latent representations of equations and integrates an enhanced CMA-ES algorithm for efficient directed optimization through a “map construction → coarse localization → fine search” paradigm. Experiments demonstrate that GenSR simultaneously improves predictive accuracy, expression simplicity, and computational efficiency while remaining robust under noisy conditions.
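The coarse-to-fine latent search described above can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: a toy quadratic landscape stands in for the fitting error of a decoded equation, prior sampling stands in for the CVAE's coarse localization, and a simple (1, λ) evolution strategy stands in for the paper's modified CMA-ES (no covariance adaptation here). All names and constants are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for the CVAE decoder + fitting error: in GenSR the
# latent vector z decodes to an equation whose error on the data is measured.
# Here a toy quadratic landscape keeps the sketch self-contained.
TARGET = np.array([0.7, -0.3])

def fitness(z):
    # Fitting error of the (hypothetical) decoded equation at latent point z.
    return float(np.sum((z - TARGET) ** 2))

# --- Coarse localization: pick the best of a handful of prior samples,
# standing in for the CVAE encoder conditioning on the numerical data.
candidates = rng.normal(size=(32, 2))
z = candidates[np.argmin([fitness(c) for c in candidates])]

# --- Fine search: a simple (1, lambda) evolution strategy as a stand-in
# for the modified CMA-ES, exploiting the locally smooth latent landscape.
sigma = 0.5
for _ in range(60):
    pop = z + sigma * rng.normal(size=(16, 2))
    best = pop[np.argmin([fitness(p) for p in pop])]
    if fitness(best) < fitness(z):
        z = best        # keep the improvement (elitist selection)
    sigma *= 0.95       # gradually narrow the search region

print(fitness(z))       # small residual error near the optimum
```

The two phases mirror the "map construction → coarse localization → fine search" paradigm: the pretrained latent space is the map, the encoder supplies the coarse starting region, and the evolutionary refinement exploits local smoothness.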
📝 Abstract
Symbolic Regression (SR) aims to recover the hidden equations behind observed data. However, most methods search within a discrete equation space, where structural modifications of equations rarely align with their numerical behavior, leaving fitting-error feedback too noisy to guide exploration. To address this challenge, we propose GenSR, a generative latent-space SR framework following the “map construction → coarse localization → fine search” paradigm. Specifically, GenSR first pretrains a dual-branch Conditional Variational Autoencoder (CVAE) to reparameterize symbolic equations into a generative latent space with symbolic continuity and local numerical smoothness. This space can be regarded as a well-structured “map” of the equation space, providing directional signals for the search. At inference, the CVAE coarsely localizes the input data to promising regions of the latent space. Then, a modified CMA-ES refines the candidate region, leveraging smooth latent gradients. From a Bayesian perspective, GenSR reframes the SR task as maximizing the conditional distribution $p(\mathrm{Equ.} \mid \mathrm{Num.})$, with CVAE training pursuing this objective through the Evidence Lower Bound (ELBO). This perspective provides a theoretical guarantee for the effectiveness of GenSR. Extensive experiments show that GenSR jointly optimizes predictive accuracy, expression simplicity, and computational efficiency, while remaining robust under noise.
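For concreteness, the ELBO invoked in the abstract takes the standard conditional-VAE form. Writing $x$ for the equation, $c$ for the numerical data, and $z$ for the latent code (a generic textbook statement, not the paper's exact notation):

$$
\log p_\theta(x \mid c) \;\ge\; \mathbb{E}_{q_\phi(z \mid x,\, c)}\!\big[\log p_\theta(x \mid z, c)\big] \;-\; D_{\mathrm{KL}}\!\big(q_\phi(z \mid x, c)\,\big\|\,p_\theta(z \mid c)\big),
$$

so maximizing the ELBO during CVAE training pushes up a lower bound on $\log p(\mathrm{Equ.} \mid \mathrm{Num.})$, which is exactly the objective GenSR targets.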