🤖 AI Summary
This work proposes a unified framework for describing evolutionary dynamics in both genotype and phenotype spaces, revealing evolution as an adaptive learning process with an intrinsic geometric structure. By integrating the principle of maximum entropy with Riemannian geometry, the authors formulate a generally covariant evolutionary dynamics in which the replicator equation is interpreted as a covariant gradient ascent on a fitness landscape. A key innovation is the establishment of a direct correspondence between the inverse metric tensor and the covariance matrix of microscopic noise, demonstrating that evolution emerges as a geometric learning process driven by underlying stochastic dynamics. This theoretical insight opens new experimental avenues: by measuring the covariance of evolutionary trajectories, one can directly infer both the metric structure of the fitness landscape and the characteristics of the driving noise.
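The key correspondence can be made concrete with a standard identity: under the Shahshahani (Fisher information) geometry on the simplex, the inverse metric tensor equals the covariance matrix of the categorical distribution, and covariant gradient ascent on mean fitness reproduces the replicator equation exactly. The sketch below is illustrative, not taken from the paper; the linear fitness vector `a` and step size are arbitrary choices.

```python
import numpy as np

def replicator_step(x, a, dt):
    """One Euler step of the replicator equation: dx_i = x_i (a_i - abar) dt."""
    abar = x @ a  # mean fitness under the current distribution x
    return x + dt * x * (a - abar)

def covariant_gradient_step(x, a, dt):
    """Gradient ascent on mean fitness F(x) = x . a, preconditioned by the
    inverse metric, which here is the covariance matrix of the categorical
    distribution x: C_ij = x_i (delta_ij - x_j)."""
    C = np.diag(x) - np.outer(x, x)  # covariance = inverse metric tensor
    grad = a                         # gradient of the linear mean fitness
    return x + dt * C @ grad

x0 = np.array([0.5, 0.3, 0.2])       # initial type frequencies
a = np.array([1.0, 2.0, 0.5])        # fitness of each type
x1 = replicator_step(x0, a, 0.01)
x2 = covariant_gradient_step(x0, a, 0.01)
print(np.allclose(x1, x2))  # True: the two updates coincide
```

Algebraically, `C @ a = x*a - x*(x @ a) = x*(a - abar)`, which is why the covariance-preconditioned gradient step and the replicator step agree term by term.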
📝 Abstract
We develop a generally covariant description of evolutionary dynamics that operates consistently in both genotype and phenotype spaces. We show that the maximum entropy principle yields a fundamental identification between the inverse metric tensor and the covariance matrix, revealing the Lande equation as a covariant gradient-ascent equation. This demonstrates that evolution can be modeled as a learning process on the fitness landscape, with the specific learning algorithm determined by the functional relation between the metric tensor and the noise covariance arising from microscopic dynamics. While the metric (or the inverse genotypic covariance matrix) has been extensively characterized empirically, the noise covariance and its associated observable (the covariance of evolutionary changes) have never been directly measured. This poses the experimental challenge of determining the functional form relating the metric to the noise covariance.
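The Lande equation named in the abstract has the standard form Δz̄ = G ∇ ln W̄: the per-generation change in the mean phenotype is the additive genetic covariance matrix G applied to the selection gradient, which is exactly a covariance-preconditioned gradient-ascent step. A minimal sketch, assuming a hypothetical quadratic log-fitness peak at `z_opt` with selection-strength matrix `S` (both illustrative choices, not from the paper):

```python
import numpy as np

def lande_step(zbar, G, z_opt, S):
    """One generation of the Lande equation, delta_zbar = G @ grad(ln Wbar),
    for the quadratic landscape ln Wbar = -0.5 (z - z_opt)^T S (z - z_opt)."""
    beta = -S @ (zbar - z_opt)  # selection gradient grad(ln Wbar) at zbar
    return zbar + G @ beta

G = np.array([[1.0, 0.3],
              [0.3, 0.5]])          # additive genetic covariance matrix
S = 0.1 * np.eye(2)                 # selection strength (weak, isotropic)
z_opt = np.array([2.0, -1.0])       # phenotypic optimum
zbar = np.zeros(2)                  # initial mean phenotype

for _ in range(200):
    zbar = lande_step(zbar, G, z_opt, S)
# the mean phenotype climbs the landscape toward z_opt, with the
# approach direction shaped by G, not by the raw gradient alone
```

Note the role of G as the inverse metric: directions with more genetic variance are climbed faster, which is the sense in which the learning algorithm is fixed by the metric-covariance relation.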