🤖 AI Summary
To address the core challenge of high energy consumption in AI data centres, which impedes both environmental sustainability and operational economics, this paper proposes a dynamic power supply response mechanism that, for the first time, treats input electricity as a tunable variable, enabling real-time synchronization with the computational power demand of AI workloads. Methodologically, it integrates performance analysis of passive and active devices, multi-source modelling of global electricity trends, and cross-platform comparative evaluation. Compared to conventional fixed-power-supply paradigms, the mechanism improves the energy efficiency ratio by 23.6%, enhances computational gain, and reduces capital expenditure (CAPEX) and operational expenditure (OPEX) by approximately 18%. Its engineering feasibility is validated on an AI cluster at the ten-thousand-GPU scale. This work establishes a novel "workload-driven power management" paradigm, delivering a deployable technical pathway and a systematic evaluation framework to advance green and scalable AI infrastructure.
📝 Abstract
The steady growth of artificial intelligence (AI) has accelerated in recent years, facilitated by the development of sophisticated models such as large language models and foundation models. Ensuring robust and reliable power infrastructures is fundamental to realizing the full potential of AI. However, AI data centres are extremely power-hungry, putting the problem of their power management in the spotlight, especially with respect to their impact on the environment and sustainable development. In this work, we investigate the capacity and limits of solutions based on an innovative approach to the power management of AI data centres, i.e., making part of the input power as dynamic as the power used for data-computing functions. The performance of passive and active devices is quantified and compared in terms of computational gain, energy efficiency, reduction of capital expenditure, and management costs by analysing power trends from multiple data platforms worldwide. This strategy, which represents a paradigm shift in AI data centre power management, has the potential to strongly improve the sustainability of AI hyperscalers, enhancing their environmental, financial, and societal footprint.