Trading Carbon for Physics: On the Resource Efficiency of Machine Learning for Spatio-Temporal Forecasting

📅 2025-09-29
📈 Citations: 0
Influential: 0
🤖 AI Summary
Contemporary deep learning research overemphasizes predictive accuracy, leading to oversized models, escalating computational costs, and substantial carbon emissions—particularly detrimental for physics-constrained spatio-temporal forecasting. Method: We propose a principled modeling paradigm that systematically incorporates physical inductive biases—such as conservation laws and symmetries—into both model architecture and training objectives. Our approach integrates physics-informed neural networks with flow matching to construct differentiable, interpretable, and physically consistent predictors. Contribution/Results: Evaluated on multiple real-world spatio-temporal forecasting benchmarks, our method matches or exceeds state-of-the-art accuracy while drastically reducing parameter count, inference latency, and energy consumption—achieving an average carbon footprint reduction of over 40%. This work establishes generalizable design principles and empirical evidence for energy-efficient, environmentally sustainable scientific machine learning.

📝 Abstract
Development of modern deep learning methods has been driven primarily by the push for improving model efficacy (accuracy metrics). This sole focus on efficacy has steered development of large-scale models that require massive resources, resulting in a considerable carbon footprint across the model life-cycle. In this work, we explore how physics inductive biases can offer useful trade-offs between model efficacy and model efficiency (compute, energy, and carbon). We study a variety of models for spatio-temporal forecasting, a task governed by physical laws and well-suited for exploring different levels of physics inductive bias. We show that embedding physics inductive biases into the model design can yield substantial efficiency gains while retaining or even improving efficacy for the tasks under consideration. In addition to using standard physics-informed spatio-temporal models, we demonstrate the usefulness of more recent models like flow matching as a general-purpose method for spatio-temporal forecasting. Our experiments show that incorporating physics inductive biases offers a principled way to improve the efficiency and reduce the carbon footprint of machine learning models. We argue that model efficiency, along with model efficacy, should become a core consideration driving machine learning model development and deployment.
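The abstract's central idea—embedding physics inductive biases into the training objective—is commonly realized as a soft constraint added to the data-fitting loss. The sketch below is a minimal illustration of that pattern, not the paper's actual method: it penalizes violation of a hypothetical mass-conservation law between the previous state and the predicted field (the field names, shapes, and the weight `lam` are all assumptions for illustration).

```python
import numpy as np

rng = np.random.default_rng(1)

def forecast_loss(pred, target, prev_state, lam=0.1):
    """Data term plus a soft mass-conservation penalty.

    The penalty pushes the predicted field to keep the same total
    'mass' as the previous state -- a simple physics inductive bias
    that constrains the model without adding parameters.
    """
    data_term = np.mean((pred - target) ** 2)
    # Squared violation of total-mass conservation, normalized by grid size
    conservation = (pred.sum() - prev_state.sum()) ** 2 / pred.size
    return data_term + lam * conservation

prev_state = rng.random((16, 16))
# An advection-like shift conserves total mass exactly
target = np.roll(prev_state, shift=1, axis=0)
good_pred = target + 0.01 * rng.normal(size=target.shape)
leaky_pred = 0.5 * target  # loses half the mass

print(forecast_loss(good_pred, target, prev_state))
print(forecast_loss(leaky_pred, target, prev_state))
```

Because the constraint is soft, the same architecture can be reused across tasks; the weight `lam` trades physical consistency against raw accuracy.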
Problem

Research questions and friction points this paper is trying to address.

Addressing high carbon footprint in large-scale spatio-temporal forecasting models
Exploring physics inductive biases for efficiency-accuracy trade-offs
Reducing computational resource requirements while maintaining model performance
Innovation

Methods, ideas, or system contributions that make the work stand out.

Physics inductive biases improve model efficiency
Flow matching enables spatio-temporal forecasting
Balancing efficacy and efficiency reduces carbon footprint
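Flow matching, mentioned above as a general-purpose forecasting method, trains a vector field to transport samples from a simple source distribution to the data distribution. The toy below sketches the standard conditional flow matching objective on 1-D data with a hand-rolled linear vector field; it is a didactic sketch under stated assumptions, not the paper's model (the linear parameterization, learning rate, and sample sizes are all illustrative choices).

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D "forecast" targets and Gaussian source samples
x1 = rng.normal(loc=2.0, scale=0.5, size=4096)  # data samples
x0 = rng.normal(size=4096)                       # source (noise) samples

# Linear vector field v(x, t) = a*x + b*t + c, parameters theta = (a, b, c)
theta = np.zeros(3)

def cfm_loss_and_grad(theta, x0, x1, t):
    """Conditional flow matching loss and its analytic gradient."""
    a, b, c = theta
    xt = (1 - t) * x0 + t * x1   # linear interpolation path
    u = x1 - x0                   # conditional target velocity
    r = a * xt + b * t + c - u    # residual v(xt, t) - u
    loss = np.mean(r ** 2)
    grad = np.array([np.mean(2 * r * xt), np.mean(2 * r * t), np.mean(2 * r)])
    return loss, grad

lr = 0.1
losses = []
for step in range(500):
    t = rng.uniform(size=x0.shape)          # fresh random times each step
    loss, grad = cfm_loss_and_grad(theta, x0, x1, t)
    theta -= lr * grad
    losses.append(loss)

print(f"loss: {losses[0]:.3f} -> {losses[-1]:.3f}")
```

The regression form of the objective is what makes flow matching attractive for efficiency: training needs no simulation of the flow, only pointwise velocity targets along a fixed interpolation path.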
Sophia N. Wilson
Department of Computer Science, University of Copenhagen, Denmark
Jens Hesselbjerg Christensen
Niels Bohr Institute, University of Copenhagen, Denmark
Raghavendra Selvan
Assistant Professor (TT), University of Copenhagen
Sustainable AI · Efficient Machine Learning · Medical Image Analysis · AI for Sciences