🤖 AI Summary
Industrial battery management must jointly model multiple tasks, such as state-of-charge (SOC) and state-of-health (SOH) estimation, remaining useful life (RUL) prediction, and fault diagnosis, over heterogeneous sensor data with multi-timescale dynamics, varying sensor resolutions, and incomplete channel observations. This heterogeneity impedes unified representation learning, driving up development costs and limiting the generalizability of task-specific models. To address this, we propose the Flexible Masked Autoencoder (FMAE), a pretraining framework that, for the first time, robustly models partially observed multichannel inputs while learning unified representations across timescales and sensor modalities. Through multitask pretraining and cross-modal representation learning, FMAE achieves state-of-the-art performance across 11 benchmark datasets: it matches state-of-the-art RUL prediction accuracy using only 2% of the inference-time data and remains nearly lossless when the system-voltage channel is dropped. The method substantially reduces data dependency and engineering overhead, demonstrating strong transferability and industrial deployability.
📝 Abstract
Industrial-scale battery management involves various types of tasks, such as estimation, prediction, and system-level diagnostics. Each task employs distinct data across temporal scales, sensor resolutions, and data channels. Building task-specific methods requires a great deal of data and engineering effort, which limits the scalability of intelligent battery management. Here we present the Flexible Masked Autoencoder (FMAE), a flexible pretraining framework that can learn with missing battery data channels and capture inter-correlations across data snippets. FMAE learns unified battery representations from heterogeneous data and can be adapted to different tasks with minimal data and engineering effort. Experimentally, FMAE consistently outperforms all task-specific methods across five battery management tasks on eleven battery datasets. On remaining-life prediction, FMAE uses 50 times less inference data while maintaining state-of-the-art results. Moreover, when real-world data lack certain information, such as system voltage, FMAE can still be applied with marginal performance impact, achieving results comparable to the best hand-crafted features. FMAE demonstrates a practical route to a flexible, data-efficient model that simplifies real-world multi-task management of dynamical systems.
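To make the core data-handling idea concrete, the following is a minimal illustrative sketch (not the paper's implementation) of how a masked autoencoder can ingest partially observed multichannel snippets: missing channels are simply absent from the input, observed signals are cut into patch tokens tagged with (channel, time) positions, and a random subset of tokens is hidden for reconstruction. All names (`tokenize_snippets`, `random_mask`, the channel names) and the patch length are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

def tokenize_snippets(snippets, patch_len=16):
    """Turn partially observed multichannel snippets into a flat token list.

    `snippets` maps channel name -> 1-D signal array; channels that were not
    recorded are simply absent, so no imputation or placeholder is needed.
    Returns patch tokens plus (channel_id, time_id) position records that a
    transformer encoder could consume as positional information.
    """
    channel_ids = {name: i for i, name in enumerate(sorted(snippets))}
    tokens, positions = [], []
    for name, signal in snippets.items():
        n_patches = len(signal) // patch_len
        for t in range(n_patches):
            tokens.append(signal[t * patch_len:(t + 1) * patch_len])
            positions.append((channel_ids[name], t))
    return np.stack(tokens), positions

def random_mask(n_tokens, mask_ratio=0.75, rng=rng):
    """Split token indices into (masked, visible); the encoder sees only the
    visible tokens, and the decoder reconstructs the masked ones."""
    n_mask = int(round(n_tokens * mask_ratio))
    perm = rng.permutation(n_tokens)
    return perm[:n_mask], perm[n_mask:]

# Example: a snippet where the 'system_voltage' channel is unavailable.
snippets = {
    "cell_voltage": rng.normal(3.7, 0.05, 128),
    "current": rng.normal(0.0, 1.0, 128),
    # "system_voltage" is missing -- the tokenizer never sees it.
}
tokens, positions = tokenize_snippets(snippets)
masked, visible = random_mask(len(tokens))
print(tokens.shape, len(masked), len(visible))  # (16, 16) 12 4
```

Because the token set is variable-length, the same pretraining objective applies whether a channel is dropped at training time (as an augmentation) or genuinely missing at inference time, which is the flexibility the abstract refers to.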