Property Testing of Computational Networks

📅 2025-12-08
📈 Citations: 0
Influential: 0
🤖 AI Summary
This paper initiates the study of property testing for weighted computational networks viewed as computational devices: given oracle access to a network's weights, how can one efficiently determine whether the network computes a target function (or a function with a given property), or is (ε,δ)-far from doing so? The authors propose a parameterized testing framework and, in a case study on simple neural Boolean networks with ReLU activations, show that any near-constant function is testable with query complexity independent of the network's size. They also show that no analogous size-independent result holds in a natural distribution-free extension of the model, or in a related vanilla testing model, delimiting the feasibility of testing in this setting.

📝 Abstract
In this paper we initiate the study of *property testing of weighted computational networks viewed as computational devices*. Our goal is to design property testing algorithms that, for a given computational network with oracle access to the weights of the network, accept (with probability at least 2/3) any network that computes a certain function (or a function with a certain property) and reject (with probability at least 2/3) any network that is *far* from computing the function (or any function with the given property). We parameterize the notion of being far and want to reject networks that are *(ε,δ)-far*, which means that one needs to change an ε-fraction of the description of the network to obtain a network that computes a function that differs on at most a δ-fraction of inputs from the desired function (or any function with a given property). To exemplify our framework, we present a case study involving simple neural Boolean networks with ReLU activation function. As a highlight, we demonstrate that for such networks, any near-constant function is testable in query complexity independent of the network's size. We also show that a similar result cannot be achieved in a natural generalization of the distribution-free model to our setting, nor in a related vanilla testing model.
Problem

Research questions and friction points this paper is trying to address.

Testing computational networks for function correctness via property testing algorithms
Defining and detecting networks that are (ε,δ)-far from computing a desired function
Demonstrating testability for neural Boolean networks with size-independent query complexity
Innovation

Methods, ideas, or system contributions that make the work stand out.

Property testing algorithms for weighted computational networks
Parameterized farness using (ε,δ)-distance metrics
Query complexity independent of network size for near-constant functions
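The (ε,δ)-farness notion in the second bullet can be stated formally as follows, writing dist_w for the fraction of the weight description that must change and dist_f for the fraction of inputs on which two functions disagree (notation ours, chosen to match the abstract's informal definition):

```latex
N \text{ is } (\varepsilon,\delta)\text{-far from computing } f
\iff
\forall N' \text{ with } \mathrm{dist}_w(N, N') \le \varepsilon :\;
\mathrm{dist}_f(f_{N'}, f) > \delta,
```

where $f_{N'}$ denotes the function computed by the modified network $N'$. Intuitively, no ε-fraction rewrite of the weights yields a network that δ-approximates $f$.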