Estimating Time-Varying Epidemic Severity Rates with Adaptive Deconvolution

📅 2025-10-17
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
In public health, time-varying epidemic severity measures—such as case fatality ratios—are confounded by viral evolution, interventions, and reporting delays, rendering conventional ratio estimators substantially biased. To address this, we propose an adaptive deconvolution framework regularized by trend filtering, enabling robust extraction of dynamic risk signals from delayed and aggregated count data. Grounded in a Poisson–binomial statistical model, our method employs an approximate maximum likelihood solution to yield locally adaptive, smooth estimates of time-varying severity. It effectively corrects for both reporting delays and aggregation bias, outperforming standard ratio-based approaches on both real-world and simulated COVID-19 data, while exhibiting robustness to model misspecification. Our key contribution is the first integration of trend filtering into deconvolution-based estimation, yielding a dynamic severity modeling framework that simultaneously ensures theoretical interpretability, computational tractability, and suitability for near-real-time analysis.

Technology Category

Application Category

📝 Abstract
Several key metrics in public health convey the probability that a primary event will lead to a more serious secondary event in the future. These "severity rates" can change over the course of an epidemic in response to shifting conditions like new therapeutics, variants, or public health interventions. In practice, time-varying parameters such as the case-fatality rate are typically estimated from aggregate count data. Prior work has demonstrated that commonly-used ratio-based estimators can be highly biased, motivating the development of new methods. In this paper, we develop an adaptive deconvolution approach based on approximating a Poisson-binomial model for secondary events, and we regularize the maximum likelihood solution in this model with a trend filtering penalty to produce smooth but locally adaptive estimates of severity rates over time. This enables us to compute severity rates both retrospectively and in real time. Experiments based on COVID-19 death and hospitalization data, both real and simulated, demonstrate that our deconvolution estimator is generally more accurate than the standard ratio-based methods, and displays reasonable robustness to model misspecification.
Problem

Research questions and friction points this paper is trying to address.

Estimating time-varying epidemic severity rates from aggregate count data
Correcting bias in ratio-based estimators for secondary event probabilities
Developing adaptive deconvolution method for real-time severity monitoring
Innovation

Methods, ideas, or system contributions that make the work stand out.

Adaptive deconvolution estimates time-varying epidemic severity rates
Poisson-binomial model with trend filtering regularization
Real-time retrospective computation using COVID-19 data
🔎 Similar Papers
No similar papers found.
Jeremy Goldwasser
Jeremy Goldwasser
Statistics PhD Student, UC Berkeley
NLPInterpretabilityStatistical ML
A
Addison J. Hu
Department of Statistics and Machine Learning Department, Carnegie Mellon University
Alyssa Bilinski
Alyssa Bilinski
Departments of Health Policy and Biostatistics, Brown University
Daniel J. McDonald
Daniel J. McDonald
Professor of Statistics, University of British Columbia
R
Ryan J. Tibshirani
Department of Statistics, University of California, Berkeley