🤖 AI Summary
This work addresses the challenge of efficiently pretraining extremely large language models in a permissionless, decentralized global environment. The authors propose a trust-minimized collaborative framework built on a blockchain, integrating the communication-efficient SparseLoCo optimizer with a dynamic node-participation mechanism. The system enables the first globally coordinated pretraining of a 72B-parameter model without requiring a whitelist of trusted participants. It supports dynamic node join and exit while training on approximately 1.1 trillion tokens, achieving model performance comparable to, or exceeding, that of centralized approaches trained on similar or higher compute budgets. The approach thus overcomes scalability and performance bottlenecks that have previously limited large-scale decentralized training.
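The summary describes SparseLoCo's communication efficiency only at a high level. As an illustrative sketch only (not the authors' implementation), a communication-efficient outer step in the DiLoCo family typically compresses each peer's pseudo-gradient with top-k sparsification plus error feedback before averaging. All function names and the `outer_lr` value below are hypothetical:

```python
import numpy as np

def topk_sparsify(vec, k):
    """Keep the k largest-magnitude entries of vec; zero the rest.
    Returns the sparse vector and the residual (the dropped mass)."""
    idx = np.argpartition(np.abs(vec), -k)[-k:]
    sparse = np.zeros_like(vec)
    sparse[idx] = vec[idx]
    return sparse, vec - sparse

def outer_step(params, peer_pseudograds, error_buffers, k, outer_lr=0.7):
    """One DiLoCo-style outer step (illustrative): each peer compresses its
    pseudo-gradient (the delta from its local inner steps) with top-k plus
    error feedback, the coordinator averages the sparse contributions, and
    the average is applied with an outer learning rate."""
    compressed = []
    for g, e in zip(peer_pseudograds, error_buffers):
        sparse, residual = topk_sparsify(g + e, k)  # error feedback: carry over dropped mass
        e[:] = residual                              # remember what was not transmitted
        compressed.append(sparse)
    update = np.mean(compressed, axis=0)
    return params - outer_lr * update
```

Only the k surviving coordinates per peer need to be communicated each outer round, which is the source of the bandwidth savings; the residual buffers ensure the dropped coordinates are eventually applied rather than lost.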
📝 Abstract
Recently, there has been increased interest in globally distributed training, which promises both to reduce training costs and to democratize participation in building large-scale foundation models. However, existing models trained in a globally distributed manner are relatively small in scale and have only been trained with whitelisted participants; they therefore do not yet realize the full promise of democratized participation. In this report, we describe Covenant-72B, an LLM produced by the largest collaborative globally distributed pre-training run to date (in terms of both compute and model scale), and the first to allow open, permissionless participation, coordinated by a live blockchain protocol. We use a state-of-the-art communication-efficient optimizer, SparseLoCo, which supports dynamic participation, with peers joining and leaving freely. Our model, pre-trained on approximately 1.1T tokens, performs competitively with fully centralized models pre-trained on similar or higher compute budgets, demonstrating that fully democratized, non-whitelisted participation is not only feasible but achievable at unprecedented scale for a globally distributed pre-training run.