🤖 AI Summary
This study addresses the efficient and stable approximation of multivariate functions by proposing a novel neural network operator with a max-min structure, extended for the first time to the multidimensional setting and activated by sigmoidal functions. Using the modulus of continuity and multivariate generalized absolute moments, the authors establish rigorous pointwise and uniform convergence results for the proposed operator and derive quantitative estimates of the order of approximation. The method combines algebraic simplicity with numerical stability, offering both a new theoretical framework and a practical tool for multivariate function approximation.
📝 Abstract
In this paper, we develop a multivariate framework for approximation by max-min neural network operators. Building on recent advances in approximation theory for neural network operators, in particular the univariate max-min operators, we propose and analyze new multivariate operators activated by sigmoidal functions. We establish pointwise and uniform convergence theorems and derive quantitative estimates for the order of approximation via the modulus of continuity and the multivariate generalized absolute moment. Our results demonstrate that the multivariate max-min structure of these operators, beyond its algebraic elegance, provides an efficient and stable approximation tool in both theoretical and applied settings.
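To give a feel for the max-min structure the abstract describes, here is a minimal univariate sketch in Python. It is an illustration only, not the authors' exact definition: the kernel `density`, the normalization so the kernel equals 1 at the nodes, and the operator form `F_n(f)(x) = max_k min(psi(nx - k), f(k/n))` are assumptions chosen to make the max-min idea concrete; the paper's multivariate operators and their precise kernels are defined in the paper itself.

```python
import numpy as np

def sigmoid(t):
    """Logistic sigmoidal activation."""
    return 1.0 / (1.0 + np.exp(-t))

def density(t):
    # Sigmoidal density phi(t) = (sigma(t+1) - sigma(t-1)) / 2, here
    # normalized so that it equals 1 at t = 0 (an assumption: kernels
    # in max-min operators typically peak at 1 at the sample nodes).
    phi = 0.5 * (sigmoid(t + 1.0) - sigmoid(t - 1.0))
    return phi / (0.5 * (sigmoid(1.0) - sigmoid(-1.0)))

def max_min_operator(f, n, x):
    # Illustrative max-min form on [0, 1] with nodes k/n, k = 0..n:
    #   F_n(f)(x) = max_k min( psi(n*x - k), f(k/n) )
    # Sums/products of classical operators are replaced by max and min.
    k = np.arange(n + 1)
    nodes = k / n
    psi = density(n * np.asarray(x)[..., None] - k)  # kernel value at each node
    return np.max(np.minimum(psi, f(nodes)), axis=-1)
```

For a smooth test function with values in [0, 1], e.g. `f = lambda x: x * (1 - x)`, the uniform error of `max_min_operator(f, n, xs)` on a grid `xs` shrinks as `n` grows, mirroring the convergence behavior the paper quantifies via the modulus of continuity.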