🤖 AI Summary
This study addresses baseline drift, cross-sensitivity, and response lag in low-cost drone-based sensors for wildfire monitoring, problems exacerbated by scarce training data. The authors propose PC²DAE, a denoising autoencoder that embeds physical constraints (non-negative concentrations and temporal smoothness) directly into its architecture rather than relying on penalty terms in the loss function. Combining softplus activations, a hierarchical decoding structure, and a physics-guided temporal smoothing mechanism, the framework is instantiated in two variants: a lean version (21k parameters) and a wide version (204k parameters). Using only 2.2 hours of flight data, the lean variant achieves a 67.3% improvement in signal smoothness, suppresses 90.7% of high-frequency noise, incurs zero physics violations, and trains in under 65 seconds, significantly outperforming five baseline methods.
📝 Abstract
Wildfire monitoring requires high-resolution atmospheric measurements, yet low-cost sensors on Unmanned Aerial Vehicles (UAVs) exhibit baseline drift, cross-sensitivity, and response lag that corrupt concentration estimates. Traditional deep learning denoising approaches demand large datasets that are impractical to obtain from limited UAV flight campaigns. We present PC$^2$DAE, a physics-informed denoising autoencoder that addresses data scarcity by embedding physical constraints directly into the network architecture. Non-negative concentration estimates are enforced via softplus activations, and physically plausible temporal smoothing is built into the forward pass, ensuring outputs are physically admissible by construction rather than relying on loss function penalties. The architecture employs hierarchical decoder heads for the Black Carbon, Gas, and CO$_2$ sensor families, with two variants: PC$^2$DAE-Lean (21k parameters) for edge deployment and PC$^2$DAE-Wide (204k parameters) for offline processing. We evaluate on 7,894 synchronized 1 Hz samples collected from UAV flights during prescribed burns in Saskatchewan, Canada (approximately 2.2 hours of flight data), two orders of magnitude below typical deep learning data requirements. PC$^2$DAE-Lean achieves a 67.3\% smoothness improvement and 90.7\% high-frequency noise reduction with zero physics violations. Five baselines (LSTM-AE, U-Net, Transformer, CBDAE, DeSpaWN) produce negative outputs on 15--23\% of samples. The lean variant outperforms the wide one (+5.6\% smoothness), suggesting that reduced capacity with a strong inductive bias prevents overfitting in data-scarce regimes. Training completes in under 65 seconds on consumer hardware.
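The core architectural idea, making outputs physically admissible by construction instead of penalizing violations in the loss, can be sketched in a few lines. This is a minimal numpy illustration under assumptions, not the paper's implementation: the `smooth` step here is a plain moving average standing in for the paper's physics-guided temporal smoothing, and `decode` stands in for a decoder head's output stage.

```python
import numpy as np

def softplus(x):
    # Numerically stable softplus: log(1 + exp(x)) > 0 for all x,
    # so decoded concentrations can never be negative.
    return np.logaddexp(0.0, x)

def smooth(x, k=5):
    # Moving-average smoothing as a stand-in for the paper's
    # physics-guided temporal smoothing mechanism (details differ).
    kernel = np.ones(k) / k
    return np.convolve(x, kernel, mode="same")

def decode(pre_activations):
    # Constraints live in the forward pass itself: the output is
    # non-negative and temporally smoothed by construction, so no
    # penalty terms are needed in the training loss.
    return smooth(softplus(pre_activations))

noisy = np.random.randn(100) * 3.0  # arbitrary decoder pre-activations
out = decode(noisy)
print((out >= 0).all())  # True: zero non-negativity violations, by construction
```

Because both the softplus output and the averaging kernel are non-negative, every output sample is non-negative regardless of what the encoder produces, which is what distinguishes a hard architectural constraint from a soft loss-function penalty.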