CoRe: Coherency Regularization for Hierarchical Time Series

📅 2025-02-21
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
This paper addresses aggregation inconsistency in hierarchical time series forecasting, which arises when noisy data violate the hierarchy's summing constraints. It proposes a soft coherency regularization method that trains neural networks to produce inherently coherent forecasts, removing the need for post-hoc reconciliation or hard constraints. The contributions are threefold: (i) the first theoretical guarantee for soft-coherent training, covering out-of-sample coherency; (ii) a gradient-based hierarchical coherency regularizer that enables end-to-end optimization; and (iii) a model-agnostic, plug-and-play design that integrates with existing neural forecasting models such as DeepAR and N-BEATS, including generative probabilistic frameworks. The approach is robust to measurement errors and missing values. Empirical evaluation on multiple benchmark datasets shows that hierarchical forecast accuracy is maintained or improved at all levels, while out-of-sample coherency significantly surpasses existing soft methods, particularly under high-noise conditions.

📝 Abstract
Hierarchical time series forecasting presents unique challenges, particularly when dealing with noisy data that may not perfectly adhere to aggregation constraints. This paper introduces a novel approach to soft coherency in hierarchical time series forecasting using neural networks. We present a network coherency regularization method, which we denote as CoRe (Coherency Regularization), a technique that trains neural networks to produce forecasts that are inherently coherent across hierarchies, without strictly enforcing aggregation constraints. Our method offers several key advantages. (1) It provides theoretical guarantees on the coherency of forecasts, even for out-of-sample data. (2) It is adaptable to scenarios where data may contain errors or missing values, making it more robust than strict coherency methods. (3) It can be easily integrated into existing neural network architectures for time series forecasting. We demonstrate the effectiveness of our approach on multiple benchmark datasets, comparing it against state-of-the-art methods in both coherent and noisy data scenarios. Additionally, our method can be used within existing generative probabilistic forecasting frameworks to generate coherent probabilistic forecasts. Our results show improved generalization and forecast accuracy, particularly in the presence of data inconsistencies. On a variety of datasets, including both strictly hierarchically coherent and noisy data, our training method has either equal or better accuracy at all levels of the hierarchy while being strictly more coherent out-of-sample than existing soft-coherency methods.
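The summary does not give the regularizer's exact form, but the idea it describes, penalizing forecasts in proportion to how badly they violate the aggregation constraints rather than projecting them onto the coherent subspace, can be sketched as follows. The constraint matrix `C`, the `coherency_penalty` helper, and the weight `lam` are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

# Hypothetical two-level hierarchy: y = [total, A, B] with total = A + B.
# Each row of C encodes one aggregation constraint; a forecast y is
# coherent exactly when C @ y == 0.
C = np.array([[1.0, -1.0, -1.0]])

def coherency_penalty(y_hat, C):
    """Soft-coherency term: squared norm of the aggregation residuals,
    averaged over the batch. Zero iff every forecast is exactly coherent."""
    residuals = y_hat @ C.T                     # shape (batch, n_constraints)
    return float(np.mean(np.sum(residuals ** 2, axis=1)))

def soft_coherent_loss(y_hat, y_true, C, lam=1.0):
    """Forecast error plus weighted coherency penalty. Because the penalty
    is differentiable, it can be minimized end-to-end alongside the
    forecasting loss instead of being enforced post hoc."""
    mse = float(np.mean((y_hat - y_true) ** 2))
    return mse + lam * coherency_penalty(y_hat, C)

# A coherent forecast (10 = 4 + 6) incurs no penalty...
coherent = np.array([[10.0, 4.0, 6.0]])
# ...while an incoherent one (10 != 3 + 6) is penalized by its residual squared.
incoherent = np.array([[10.0, 3.0, 6.0]])
```

Because the constraint is soft, a forecast for noisy data that itself violates the hierarchy is not forced onto the coherent subspace; the weight `lam` trades off accuracy against coherency.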
Problem

Research questions and friction points this paper is trying to address.

Soft coherency in hierarchical time series
Neural network coherency regularization
Handling noisy and incomplete data
Innovation

Methods, ideas, or system contributions that make the work stand out.

Neural network coherency regularization
Soft coherency in hierarchical forecasting
Adaptable to noisy or missing data