QuantaAlpha: An Evolutionary Framework for LLM-Driven Alpha Mining

📅 2026-02-06
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the challenges posed by market noise and non-stationarity in financial time series, which render traditional alpha factors susceptible to backtest overfitting and ineffective under regime shifts. To overcome the limitations of existing agent-based frameworks—particularly their lack of controllable multi-round search and experience reuse mechanisms—we propose QuantaAlpha, a novel approach that formulates end-to-end alpha factor discovery as a trajectory optimization problem. By employing trajectory-level mutation and crossover operations, QuantaAlpha identifies suboptimal steps and recombines high-return segments to enable effective pattern reuse. Semantic consistency constraints and complexity control are further integrated to ensure factor interpretability and generalization. Combining large language models with evolutionary algorithms, our method achieves an information coefficient (IC) of 0.1501, annualized return of 27.75%, and maximum drawdown of 7.98% on the CSI 300 index, and demonstrates strong transferability with four-year cumulative excess returns of 160% and 137% on CSI 500 and S&P 500, respectively.

📝 Abstract
Financial markets are noisy and non-stationary, making alpha mining highly sensitive to noise in backtesting results and to sudden market regime shifts. While recent agentic frameworks improve the automation of alpha mining, they often lack controllable multi-round search and reliable reuse of validated experience. To address these challenges, we propose QuantaAlpha, an evolutionary alpha mining framework that treats each end-to-end mining run as a trajectory and improves factors through trajectory-level mutation and crossover operations. QuantaAlpha localizes suboptimal steps in each trajectory for targeted revision and recombines complementary high-reward segments to reuse effective patterns, enabling structured exploration and refinement across mining iterations. During factor generation, QuantaAlpha enforces semantic consistency across the hypothesis, the factor expression, and the executable code, while constraining the complexity and redundancy of the generated factors to mitigate crowding. Extensive experiments on the China Securities Index 300 (CSI 300) demonstrate consistent gains over strong baseline models and prior agentic systems. When using GPT-5.2, QuantaAlpha achieves an Information Coefficient (IC) of 0.1501, with an Annualized Rate of Return (ARR) of 27.75% and a Maximum Drawdown (MDD) of 7.98%. Moreover, factors mined on CSI 300 transfer effectively to the China Securities Index 500 (CSI 500) and the Standard & Poor's 500 Index (S&P 500), delivering cumulative excess returns of 160% and 137% over four years, respectively, indicating strong robustness of QuantaAlpha under market distribution shifts.
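The reported Information Coefficient (IC) is the standard cross-sectional correlation between a factor's values and forward returns, averaged over trading days. The abstract does not say whether Pearson or rank IC is used; a minimal sketch assuming the rank variant (Spearman correlation), with all function names illustrative:

```python
import numpy as np
from scipy.stats import spearmanr

def daily_ic(factor: np.ndarray, fwd_returns: np.ndarray) -> float:
    """Cross-sectional rank IC: Spearman correlation between one day's
    factor values and the corresponding forward returns."""
    corr, _ = spearmanr(factor, fwd_returns)
    return float(corr)

def mean_ic(factor_panel: np.ndarray, return_panel: np.ndarray) -> float:
    """Average the daily IC across all days (rows = days, cols = stocks)."""
    ics = [daily_ic(f, r) for f, r in zip(factor_panel, return_panel)]
    return float(np.mean(ics))

# Toy check: a factor that perfectly ranks returns has mean IC = 1.0.
rng = np.random.default_rng(0)
returns = rng.normal(size=(5, 50))   # 5 days, 50 stocks
perfect_factor = returns.copy()      # factor identical to forward returns
print(round(mean_ic(perfect_factor, returns), 4))  # 1.0
```

In practice the IC of 0.1501 reported above would be computed this way over the backtest period, with real factor values and next-period returns in place of the toy data.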
Problem

Research questions and friction points this paper is trying to address.

alpha mining
financial markets
non-stationarity
noise sensitivity
market regime shifts
Innovation

Methods, ideas, or system contributions that make the work stand out.

evolutionary framework
trajectory-level mutation
semantic consistency
alpha mining
cross-market transfer
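The trajectory-level operators can be illustrated abstractly. Assuming a trajectory is a sequence of (step, reward) pairs, mutation revises the lowest-reward step and crossover recombines the higher-reward step at each position; the paper's actual operators act on LLM mining runs, so `revise_step` and all names below are hypothetical:

```python
def mutate(traj, revise_step):
    """Trajectory-level mutation: locate the lowest-reward step and
    replace it with a revised step. In the paper's setting, revise_step
    would be an LLM call that rewrites the suboptimal step."""
    worst = min(range(len(traj)), key=lambda i: traj[i][1])
    new_traj = list(traj)
    new_traj[worst] = revise_step(traj[worst])
    return new_traj

def crossover(a, b):
    """Trajectory-level crossover: at each position keep the step with
    the higher reward, recombining complementary high-reward segments."""
    return [sa if sa[1] >= sb[1] else sb for sa, sb in zip(a, b)]

# Steps are (description, reward); e.g. hypothesis, expression, code stages.
a = [("h1", 0.9), ("e1", 0.2), ("c1", 0.8)]
b = [("h2", 0.4), ("e2", 0.7), ("c2", 0.3)]
print([s for s, _ in crossover(a, b)])  # ['h1', 'e2', 'c1']
```

This greedy per-position recombination is only one plausible instantiation; the framework may use segment-level splicing or reward-weighted selection instead.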
Authors
Jun Han (SUFE)
Shuo Zhang (QuantaAlpha)
Wei Li (SUFE)
Zhi Yang (SUFE)
Yifan Dong (Boise State University)
Tu Hu (QuantaAlpha)
Jialuo Yuan (Stanford)
Xiaomin Yu (QuantaAlpha)
Yumo Zhu (SUFE)
Fangqi Lou (SUFE)
Xin Guo (SUFE) · large language models, conformal prediction
Zhaowei Liu (SUFE)
Tianyi Jiang (PKU)
Ruichuan An (Xi'an Jiaotong University | Peking University) · VLMs, data-centric AI
Jingping Liu (ECUST) · large language models, knowledge graphs
Biao Wu (QuantaAlpha)
Rongze Chen (QuantaAlpha)
Kunyi Wang (UBC; KAUST) · vision, graphics
Yifan Wang (QuantaAlpha)
Sen Hu (QuantaAlpha, PKU)
Xinbing Kong (SEU)
Liwen Zhang (SUFE)
Ronghao Chen (QuantaAlpha, PKU)
Huacan Wang (QuantaAlpha)