🤖 AI Summary
Systematic (epistemic) uncertainties arising from modeling deficiencies in high-energy physics simulators remain difficult to quantify rigorously.
Method: This project introduces a paradigm that integrates physics-informed priors with bias-aware machine learning. We build the first large-scale AI competition platform dedicated to systematic uncertainty quantification, combining Monte Carlo simulation, surrogate modeling, and uncertainty-calibration techniques to enable robust parameter inference from biased simulation data.
Contributions/Results: We release the first standardized uncertainty benchmark dataset for Higgs physics; develop an interpretable, calibratable framework for assessing systematic errors; and improve both model robustness to simulator misspecification and the accuracy of uncertainty estimates. These advances establish a trustworthy-AI pathway toward precision measurements in particle physics.
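To make the calibration goal concrete, the sketch below is a hypothetical toy (not the challenge's actual scoring code): it measures the empirical coverage of a naive 68% confidence interval for a parameter, first with an unbiased simulator and then with an unmodelled shift standing in for a mis-modelled systematic effect. All names (`estimate_mu`, `empirical_coverage`, the Gaussian toy model) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def estimate_mu(sample):
    """Point estimate and naive 68% interval for the mean of a sample."""
    mu_hat = sample.mean()
    sigma_hat = sample.std(ddof=1) / np.sqrt(len(sample))
    return mu_hat, (mu_hat - sigma_hat, mu_hat + sigma_hat)

def empirical_coverage(mu_true, n_trials=2000, n_events=200, bias=0.0):
    """Fraction of pseudo-experiments whose interval contains mu_true.

    `bias` shifts the generated data, mimicking a systematic effect
    that the simulator models incorrectly.
    """
    hits = 0
    for _ in range(n_trials):
        sample = rng.normal(mu_true + bias, 1.0, n_events)
        _, (lo, hi) = estimate_mu(sample)
        hits += lo <= mu_true <= hi
    return hits / n_trials

# With an unbiased simulator the 68% interval covers roughly 68% of the
# time; an unmodelled bias sharply degrades coverage, which is the kind
# of failure a bias-aware, calibrated method must avoid.
cov_unbiased = empirical_coverage(1.0, bias=0.0)
cov_biased = empirical_coverage(1.0, bias=0.2)
```

A calibrated method would widen or shift its intervals to restore nominal coverage even when the simulation is biased, which is precisely what the benchmark is designed to test.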
📝 Abstract
The FAIR Universe -- HiggsML Uncertainty Challenge focuses on measuring the physics properties of elementary particles with imperfect simulators, where differences in the modelling of systematic effects bias the measurement. The challenge also leverages a large-compute-scale AI platform for sharing datasets, training models, and hosting machine learning competitions. It brings together the physics and machine learning communities to advance our understanding of, and methodologies for, handling systematic (epistemic) uncertainties within AI techniques.