Broad stochastic configuration residual learning system for norm-convergent universal approximation

📅 2025-11-20
📈 Citations: 0
Influential: 0
🤖 AI Summary
Existing randomized learning networks (e.g., BRLS) achieve only universal approximation in probability, lacking norm-convergence guarantees. To address this, the paper proposes the Broad Stochastic Configuration Residual Learning System (BSCRLS). Its core innovation is to introduce, for the first time, an adaptive supervisory mechanism that constrains the range settings of random parameters within the broad residual learning framework, together with a theoretical proof of norm-based universal approximation. Three incremental learning algorithms are further designed to enable efficient model updating under diverse deployment scenarios. Experiments on a public solar-panel dust-accumulation detection dataset show that BSCRLS outperforms 13 state-of-the-art deep and broad learning methods, supporting both its convergence guarantees and its practical utility.

📝 Abstract
Universal approximation is the foundation of neural network learning algorithms. However, some networks establish their universal approximation property by showing that the iterative errors converge in probability measure rather than in the more rigorous sense of norm convergence, which makes the universal approximation property of randomized learning networks highly sensitive to random parameter selection. The broad residual learning system (BRLS), as a member of the family of randomized learning models, also encounters this issue. We theoretically demonstrate the limitation of its universal approximation property: the iterative errors do not satisfy norm convergence if the selection of random parameters is inappropriate and the convergence rate meets certain conditions. To address this issue, we propose the broad stochastic configuration residual learning system (BSCRLS) algorithm, which features a novel supervisory mechanism that adaptively constrains the range settings of random parameters on the basis of the BRLS framework. Furthermore, we prove the universal approximation theorem of BSCRLS under the more stringent criterion of norm convergence. Three versions of incremental BSCRLS algorithms are presented to satisfy the application requirements of various network updates. Solar panel dust detection experiments are performed on a publicly available dataset and compared against 13 deep and broad learning algorithms. Experimental results confirm the effectiveness and superiority of the BSCRLS algorithms.
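The convergence gap the abstract describes can be made precise. Writing $e_n = f - f_n$ for the residual after $n$ added nodes (notation assumed here for illustration, not taken from the paper), the two notions are:

```latex
% Convergence in probability (the weaker guarantee the abstract
% attributes to BRLS-style randomized networks): for every
% tolerance, the chance of a large residual vanishes, but any
% single random draw of parameters may still fail to converge.
\lim_{n \to \infty} P\big( \| e_n \| > \varepsilon \big) = 0
  \quad \text{for all } \varepsilon > 0

% Norm convergence (the stricter guarantee BSCRLS proves):
% the residual itself is driven to zero for every admissible
% configuration of the supervised random parameters.
\lim_{n \to \infty} \| e_n \| = 0
```

Under convergence in probability alone, an unlucky random draw can stall the error indefinitely, which is exactly the sensitivity to random parameter selection the abstract points out.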
Problem

Research questions and friction points this paper is trying to address.

Randomized learning networks lack norm convergence guarantees
Inappropriate parameter selection limits universal approximation capability
Broad residual learning systems require stricter convergence criteria
Innovation

Methods, ideas, or system contributions that make the work stand out.

Novel supervisory mechanism for random parameter constraints
Three incremental algorithm versions for network updates
Norm-convergent universal approximation with stricter convergence
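The supervisory mechanism in the bullets above follows the stochastic configuration idea: a candidate random node is accepted only if it satisfies an inequality tied to the current residual, and the admissible random range is widened adaptively when no candidate passes. Below is a minimal sketch of that loop; the function names, the sigmoid activation, the specific acceptance inequality, and the scope-doubling rule are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np

def supervisory_check(e, g, r=0.99):
    """Accept candidate node output g only if it is sufficiently
    correlated with the current residual e (SCN-style inequality)."""
    return (e @ g) ** 2 / (g @ g) >= (1.0 - r) * (e @ e)

def fit_bscrls_sketch(X, y, max_nodes=50, r=0.99, scope=1.0,
                      tries=100, rng=None):
    """Incrementally add sigmoid nodes whose random parameters pass
    the supervisory constraint; output weights are refit by least
    squares after each accepted node."""
    rng = rng if isinstance(rng, np.random.Generator) \
        else np.random.default_rng(rng)
    H = np.empty((X.shape[0], 0))   # hidden-output matrix, grows by columns
    e = y.copy()                     # current residual
    for _ in range(max_nodes):
        for _ in range(tries):
            # Draw candidate random parameters from the current scope.
            w = rng.uniform(-scope, scope, X.shape[1])
            b = rng.uniform(-scope, scope)
            g = 1.0 / (1.0 + np.exp(-(X @ w + b)))
            if supervisory_check(e, g, r):
                H = np.column_stack([H, g])
                beta, *_ = np.linalg.lstsq(H, y, rcond=None)
                e = y - H @ beta
                break
        else:
            # No candidate passed: adaptively widen the random range.
            scope *= 2.0
    return H, e
```

The refit-by-least-squares step is one simple choice; the point of the sketch is the accept/reject gate, which is what distinguishes a stochastic configuration scheme from plain random node assignment.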
Han Su — School of Control and Computer Engineering, North China Electric Power University, Beijing, 102206, China
Zhongyan Li — School of Mathematics and Physics, North China Electric Power University, Beijing, 102206, China
Wanquan Liu — Sun Yat-sen University
Computer vision · Intelligent control · Pattern recognition