🤖 AI Summary
This work addresses the critical lack of publicly available, high-resolution power consumption data for generative AI workloads, which hinders accurate data center energy estimation and infrastructure planning. The study presents fine-grained (0.1-second interval) empirical measurements of power draw during training, fine-tuning, and inference tasks on a high-performance computing cluster equipped with NVIDIA H100 GPUs. By integrating standardized benchmarks from MLCommons and vLLM, the authors construct representative workload profiles and extend them to facility-scale energy modeling through an event-driven, bottom-up approach, yielding dynamic power consumption traces that capture real-world temporal fluctuations. This effort delivers the first open, high-resolution dataset on generative AI power usage and introduces a reproducible, scalable methodology to support grid integration and distributed energy resource planning.
📝 Abstract
The rapid growth of generative artificial intelligence (AI) has introduced unprecedented computational demands, driving significant increases in the energy footprint of data centers. However, existing power consumption data is largely proprietary and reported at varying resolutions, creating challenges for estimating whole-facility energy use and planning infrastructure. In this work, we present a methodology that bridges this gap by linking high-resolution workload power measurements to whole-facility energy demand. Using NLR's high-performance computing data center equipped with NVIDIA H100 GPUs, we measure the power consumption of AI training, fine-tuning, and inference jobs at 0.1-second resolution. Workloads are characterized using MLCommons benchmarks for model training and fine-tuning, and vLLM benchmarks for inference, enabling reproducible and standardized workload profiling. The dataset of power consumption profiles is made publicly available. These power profiles are then scaled to the whole-facility level using a bottom-up, event-driven data center energy model. The resulting whole-facility energy profiles capture realistic temporal fluctuations driven by AI workloads and user behavior, and can be used to inform infrastructure planning for grid connection, on-site energy generation, and distributed microgrids.
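To make the bottom-up, event-driven scaling step concrete, here is a minimal sketch of how measured per-job power traces could be aggregated onto a shared facility timeline and scaled by a PUE-style overhead factor. All names, the `JobEvent` structure, and the PUE value are illustrative assumptions, not the paper's actual model; only the 0.1-second sampling interval comes from the text.

```python
from dataclasses import dataclass

SAMPLE_S = 0.1  # 0.1-second sampling interval, as in the measurements

@dataclass
class JobEvent:
    start_idx: int          # start sample index on the facility timeline
    power_trace_w: list     # measured per-job IT power trace in watts

def facility_power(events, horizon, pue=1.2):
    """Sum event power traces onto a shared timeline and apply a PUE factor.

    `horizon` is the number of 0.1 s samples in the facility profile;
    `pue` (illustrative value) accounts for cooling and other overheads.
    """
    it_power = [0.0] * horizon
    for ev in events:
        for i, p in enumerate(ev.power_trace_w):
            t = ev.start_idx + i
            if t < horizon:
                it_power[t] += p
    return [p * pue for p in it_power]

def total_energy_kwh(power_w):
    """Integrate a 0.1 s power trace (W) into energy (kWh)."""
    return sum(power_w) * SAMPLE_S / 3.6e6

# Example: two overlapping jobs on a 10-sample (1-second) horizon.
events = [
    JobEvent(0, [700.0] * 6),   # e.g. one H100 near its ~700 W board power
    JobEvent(3, [350.0] * 5),   # a lighter inference job starting later
]
profile = facility_power(events, horizon=10, pue=1.2)
```

The event-driven structure is what lets the model reproduce the temporal fluctuations the abstract mentions: job arrivals and departures shift traces on the timeline rather than smearing them into an average load.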