🤖 AI Summary
Quantifying the cumulative growth trajectory of AI models on Hugging Face—and their latent innovation potential—remains challenging, particularly in modeling how downstream fine-tuning propagates influence over time.
Method: We adapt the three-parameter citation dynamics framework (immediacy, longevity, relative fitness) to model influence, integrating Wang et al.'s citation growth model with the Hugging Face model lineage and fine-tuning dependency graph to jointly estimate parameters and classify evolutionary trajectories.
Contribution/Results: This work introduces the first interpretable, dynamics-based framework for modeling the evolution of model influence. It accurately identifies critical patterns, including sudden growth surges, long-tail latency periods, and innovation inflection points, while providing standardized, quantitative metrics for ecosystem assessment and technology forecasting. Empirical evaluation demonstrates high fidelity in fitting the adoption pathways of major models and robust detection of anomalous influence patterns.
📝 Abstract
As the open-weight AI landscape continues to proliferate, with rapid model development, significant investment, and growing user interest, it becomes increasingly important to predict which models will ultimately drive innovation and shape AI ecosystems. Building on parallels with citation dynamics in scientific literature, we propose a framework to quantify how an open-weight model's influence evolves. Specifically, we adapt the model introduced by Wang et al. for scientific citations, using three key parameters (immediacy, longevity, and relative fitness) to track the cumulative number of fine-tuned models derived from an open-weight model. Our findings reveal that this citation-style approach can effectively capture the diverse trajectories of open-weight model adoption, with most models fitting well and outliers indicating unique patterns or abrupt jumps in usage.
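To make the adapted model concrete, the sketch below fits Wang et al.'s three-parameter citation curve, c(t) = m(e^{λΦ((ln t − μ)/σ)} − 1), where Φ is the standard normal CDF, λ is relative fitness, μ is immediacy, and σ is longevity, to a synthetic cumulative fine-tune count. The constant m and all parameter values here are illustrative assumptions, not values from the paper.

```python
import numpy as np
from scipy.stats import norm
from scipy.optimize import curve_fit

M = 30.0  # assumed scaling constant m (illustrative, not from the paper)

def cumulative_influence(t, lam, mu, sigma):
    """Wang et al. curve: c(t) = m * (exp(lam * Phi((ln t - mu)/sigma)) - 1).

    t     : time since model release (e.g. days), t > 0
    lam   : relative fitness (long-run appeal)
    mu    : immediacy (how quickly adoption takes off)
    sigma : longevity (how long adoption keeps growing)
    """
    return M * (np.exp(lam * norm.cdf((np.log(t) - mu) / sigma)) - 1.0)

# Synthetic trajectory: generate a noiseless curve, then recover its parameters.
true_params = (1.8, 2.0, 0.8)          # (fitness, immediacy, longevity)
t = np.arange(1.0, 365.0)              # days since release
c = cumulative_influence(t, *true_params)

fit, _ = curve_fit(
    cumulative_influence, t, c,
    p0=(1.0, 1.0, 1.0),
    bounds=([0.01, 0.01, 0.01], [10.0, 10.0, 10.0]),
)
```

In practice `c` would be the observed cumulative count of fine-tunes from the Hugging Face dependency graph, and the fitted (λ, μ, σ) triple becomes the model's trajectory signature; poorly fitting trajectories flag the outliers and abrupt usage jumps mentioned above.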