Slack More, Predict Better: Proximal Relaxation for Probabilistic Latent Variable Model-based Soft Sensors

📅 2026-03-11
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work proposes KProxNPLVM, a novel approach to soft sensor modeling that addresses the accuracy limitations of traditional nonlinear probabilistic latent variable models (NPLVMs) caused by amortized variational inference with parametric posterior approximations. By introducing a proximal relaxation of the objective function into NPLVM training and employing the Wasserstein distance to construct a nonparametric variational inference strategy, the method circumvents the constraints of finite-dimensional parameter spaces. This theoretically eliminates approximation error and ensures convergence. Experimental results on both synthetic and real-world industrial datasets demonstrate that KProxNPLVM significantly outperforms existing models, achieving substantial improvements in prediction accuracy and robustness for soft sensing applications.

📝 Abstract
Nonlinear Probabilistic Latent Variable Models (NPLVMs) are a cornerstone of soft sensor modeling due to their capacity for uncertainty delineation. However, conventional NPLVMs are trained with amortized variational inference, in which neural networks parameterize the variational posterior. While this parameterization simplifies implementation, it converts a distributional optimization problem over an infinite-dimensional function space into parameter optimization over a finite-dimensional parameter space, introducing an approximation error gap that degrades soft sensor modeling accuracy. To alleviate this issue, we introduce KProxNPLVM, a novel NPLVM that instead relaxes the learning objective itself. Specifically, we first prove that the conventional approach incurs this approximation error. We then employ the Wasserstein distance as a proximal operator to relax the learning objective, and derive a new variational inference strategy by solving the relaxed optimization problem. On this foundation, we rigorously derive KProxNPLVM's optimization procedure and prove that the algorithm converges while sidestepping the approximation error. Finally, extensive experiments on synthetic and real-world industrial datasets demonstrate the efficacy of the proposed KProxNPLVM.
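To make the abstract's idea concrete, here is a minimal sketch, not the paper's algorithm, of a nonparametric, particle-based variational update driven by a Wasserstein proximal (JKO-style) scheme, q_{k+1} = argmin_q KL(q || p) + W_2^2(q, q_k) / (2τ). Its standard particle discretization is an unadjusted Langevin step; the 1-D Gaussian target posterior, step size, and particle count below are illustrative assumptions.

```python
import numpy as np

MU, SIGMA = 1.5, 0.5  # assumed target posterior p = N(MU, SIGMA^2)

def grad_log_target(z):
    # Score function of the Gaussian target; in an NPLVM this would be the
    # gradient of the log joint density with respect to the latent variable.
    return -(z - MU) / SIGMA**2

def wasserstein_proximal_step(particles, tau, rng):
    # One explicit step of the Wasserstein gradient flow of KL(q || p):
    # drift along the score (potential term) plus Gaussian noise (entropy term).
    noise = rng.standard_normal(particles.shape)
    return particles + tau * grad_log_target(particles) + np.sqrt(2 * tau) * noise

rng = np.random.default_rng(0)
particles = rng.normal(0.0, 1.0, size=2000)  # q_0: particles far from the target
for _ in range(200):
    particles = wasserstein_proximal_step(particles, tau=0.05, rng=rng)

print(particles.mean(), particles.var())  # both approach MU and SIGMA^2
```

Because the posterior is represented by particles rather than by a parametric family, no finite-dimensional posterior parameterization is imposed; this is the intuition behind sidestepping the approximation error gap that the abstract attributes to amortized variational inference.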
Problem

Research questions and friction points this paper is trying to address.

Nonlinear Probabilistic Latent Variable Models
amortized variational inference
approximation error
soft sensors
distributional optimization
Innovation

Methods, ideas, or system contributions that make the work stand out.

Proximal Relaxation
Wasserstein Distance
Variational Inference
Nonlinear Probabilistic Latent Variable Models
Soft Sensors
Zehua Zou
Hangzhou International Innovation Institute, Beihang University, Hangzhou 311115, China
Yiran Ma
State Key Laboratory of Industrial Control Technology, College of Control Science and Engineering, Zhejiang University, Hangzhou 310027, Zhejiang, China
Yulong Zhang
Google
Security and Privacy
Zhengnan Li
The Chinese University of Hong Kong, Shenzhen
Time Series Forecasting, Continual Learning
Zeyu Yang
Huzhou University
Industrial Big Data, Data-driven Modeling, Soft Sensor, Process Monitoring
Jinhao Xie
MOE Key Laboratory of Bioinorganic and Synthetic Chemistry, Key Lab of Low-Carbon Chem & Energy Conservation of Guangdong Province, School of Chemistry, Sun Yat-Sen University, Guangzhou 510275, China
Xiaoyu Jiang
Associate Professor (Research), Beihang University
Deep Learning, Industrial Intelligence, AI Security
Zhichao Chen
National Key Lab of General AI, School of Intelligence Science and Technology, Peking University, Beijing 100871, China