Efficient Estimation of Unfactorizable Systematic Uncertainties

📅 2025-09-18
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
In high-dimensional physical data (e.g., from LHC experiments), systematic uncertainties often cannot be factorized across input dimensions, yet existing uncertainty quantification methods impose restrictive factorizability assumptions—leading to significant inaccuracies. Method: We propose a factorization-free, computationally efficient framework for systematic uncertainty quantification. It employs a derivative-enhanced surrogate model based on Gaussian process regression and integrates Bayesian experimental design for adaptive sampling. Contribution/Results: Our approach substantially improves both accuracy and computational efficiency in estimating non-factorizable systematic errors. In representative benchmarks, it achieves up to 40% lower estimation error than conventional random or grid-based sampling—using fewer evaluation points. The framework scales effectively to high-dimensional settings and constitutes the first methodology that simultaneously ensures theoretical rigor and practical applicability for systematic error modeling in complex experimental physics.

Technology Category

Application Category

📝 Abstract
Accurate assessment of systematic uncertainties is an increasingly vital task in physics studies, where large, high-dimensional datasets, like those collected at the Large Hadron Collider, hold the key to new discoveries. Common approaches to assessing systematic uncertainties rely on simplifications, such as assuming that the impact of the various sources of uncertainty factorizes. In this paper, we provide realistic example scenarios in which this assumption fails. We introduce an algorithm that uses Gaussian process regression to estimate the impact of systematic uncertainties extit{without} assuming factorization. The Gaussian process models are enhanced with derivative information, which increases the accuracy of the regression without increasing the number of samples. In addition, we present a novel sampling strategy based on Bayesian experimental design, which is shown to be more efficient than random and grid sampling in our example scenarios.
Problem

Research questions and friction points this paper is trying to address.

Estimating unfactorizable systematic uncertainties in physics
Enhancing accuracy without increasing sample size
Developing efficient sampling for high-dimensional data analysis
Innovation

Methods, ideas, or system contributions that make the work stand out.

Gaussian process regression without factorization assumption
Enhanced models with derivative information for accuracy
Novel Bayesian experimental design sampling strategy
🔎 Similar Papers
No similar papers found.
A
Alexis Romero
Department of Physics and Astronomy, University of California, Irvine CA
Kyle Cranmer
Kyle Cranmer
University of Wisconsin-Madison
Particle Physicsdeep learningData ScienceStatisticsOpen Science
D
Daniel Whiteson
Department of Physics and Astronomy, University of California, Irvine CA