🤖 AI Summary
Real-world time-series data often exhibit irregular sampling and mixed-type tokens (categorical, numerical, and textual), posing challenges for unified modeling: discrete tokenization yields weak numerical representations, while neural ODEs struggle with categorical variables and informative sampling. This paper introduces the first end-to-end decoder-only Transformer architecture tailored to irregular, mixed-type time series. It employs a hybrid embedding (combining a categorical encoding with a numerical projection) and a hierarchical loss (jointly optimizing cross-entropy and regression objectives) to generalize next-token prediction into discrete-continuous joint likelihood estimation, with no data augmentation or model ensembling required. Evaluated on physics simulations, electrocardiograms, and multivariate electronic health records, our method significantly outperforms state-of-the-art approaches in both generation and anomaly detection, while improving training efficiency by over 30%.
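As a concrete illustration of the hybrid embedding idea described above, the sketch below sums a learned class embedding with a linear projection of a token's numeric value. It is a minimal, assumed implementation: the module name `HybridEmbedding`, the shapes, and the zero-fill convention for tokens without a numeric value are illustrative choices, not the paper's published code.

```python
import torch
import torch.nn as nn

class HybridEmbedding(nn.Module):
    """Embed a mixed-type token as the sum of a learned class embedding
    and a linear projection of its numeric value (zero-filled for purely
    categorical tokens). Illustrative sketch, not the paper's code."""

    def __init__(self, n_classes: int, d_model: int):
        super().__init__()
        self.class_emb = nn.Embedding(n_classes, d_model)  # categorical encoding
        self.value_proj = nn.Linear(1, d_model)            # numerical projection

    def forward(self, class_ids: torch.Tensor, values: torch.Tensor) -> torch.Tensor:
        # class_ids: (batch, seq) integer token classes
        # values:    (batch, seq) float values, 0.0 where the token has no value
        return self.class_emb(class_ids) + self.value_proj(values.unsqueeze(-1))
```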
📝 Abstract
Real-world processes often generate data that are a mix of categorical and numeric values recorded at irregular and informative intervals. Discrete token-based approaches are limited in numeric representation capacity, while methods such as neural ordinary differential equations are not well suited to categorical data or informative sampling and require augmentation to handle certain classes of trajectories. Here, we present multivariateGPT, a single architecture for modeling sequences of mixed categorical (including tokenized text) and numeric data. This is accomplished with an autoregressive sequence decomposition, embedding scheme, and loss function that extend the next-token prediction task to likelihood estimation of the joint distribution of the next token's class and value. We demonstrate how this approach can efficiently learn to generalize patterns in simple physical systems and model complex time series, including electrocardiograms and multivariate electronic health record data. This work extends the utility of transformer-based models to additional classes of data.
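To make the loss-function idea concrete, here is a minimal sketch of a joint discrete-continuous next-token objective: cross-entropy over the next token's class plus a regression term over its value, applied only where the target token carries a number. The function name, the mean-squared-error term, and the equal weighting are assumptions; the paper's actual joint-likelihood decomposition may differ (for example, it could use a Gaussian negative log-likelihood for the value term).

```python
import torch
import torch.nn.functional as F

def joint_next_token_loss(class_logits, value_pred, target_class, target_value, is_numeric):
    """Hypothetical joint objective for mixed-type next-token prediction.

    class_logits: (batch, seq, n_classes) predicted class logits
    value_pred:   (batch, seq) predicted numeric value per position
    target_class: (batch, seq) integer class labels
    target_value: (batch, seq) float value labels
    is_numeric:   (batch, seq) boolean mask marking tokens that carry a value
    """
    # Discrete part: cross-entropy over the next token's class.
    ce = F.cross_entropy(class_logits.flatten(0, 1), target_class.flatten())

    # Continuous part: regression on the value, only where a value exists.
    numeric_loss = torch.tensor(0.0, device=value_pred.device)
    if is_numeric.any():
        numeric_loss = (value_pred - target_value).pow(2)[is_numeric].mean()

    return ce + numeric_loss
```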