Measuring Neural Network Complexity via Effective Degrees of Freedom

📅 2026-02-13
📈 Citations: 0
Influential: 0
🤖 AI Summary
This study addresses the challenge of quantifying the complexity of feed-forward neural networks, which arises from their nonlinear, hierarchical architecture and large parameter counts. The authors present the first application of generalized degrees of freedom (GDF) to complexity assessment in binary-classification feed-forward networks, adapting the algorithm, originally formulated for continuous responses, to discrete outcomes. GDF measures the sensitivity of fitted values to perturbations in the observed responses, circumventing the need for likelihood-based assumptions. In both simulated and real-data experiments, GDF yields more robust complexity estimates than likelihood-based cross-validation and the null-degrees-of-freedom approach of Landsittel et al., particularly under model misspecification. The work thus provides a reliable, likelihood-free metric for evaluating neural network complexity.

📝 Abstract
Quantifying the complexity of feed-forward neural networks (FFNNs) remains challenging due to their nonlinear, hierarchical structure and numerous parameters. We apply generalized degrees of freedom (GDF) to measure model complexity in FFNNs with binary outcomes, adapting the algorithm for discrete responses. We compare GDF with both the effective number of parameters derived via log-likelihood cross-validation and the null degrees of freedom of Landsittel et al. Through simulation studies and a real data analysis, we demonstrate that GDF provides a robust assessment of model complexity for neural network models, as it depends only on the sensitivity of fitted values to perturbations in the observed responses rather than on assumptions about the likelihood. In contrast, cross-validation-based estimates of model complexity and the null degrees of freedom rely on the correctness of the assumed likelihood and may exhibit substantial variability. We find that GDF, cross-validation-based measures, and null degrees of freedom yield similar assessments of model complexity only when the fitted model adequately represents the data-generating mechanism. These findings highlight GDF as a stable and broadly applicable measure of model complexity for neural networks in statistical modeling.
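The perturbation idea behind GDF can be sketched numerically. The following is a minimal illustration only, not the authors' implementation: a plain logistic regression stands in for an FFNN, and GDF is estimated by repeatedly flipping a small fraction of the binary responses, refitting, and regressing each fitted value on its perturbed response across replicates. The flip probability, replicate count, and fitting routine are all assumptions made for the sketch.

```python
import numpy as np

def fit_logistic(X, y, lr=0.5, n_iter=400):
    """Gradient-descent logistic regression (stand-in for an FFNN fit)."""
    w = np.zeros(X.shape[1])
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-X @ w))
        w += lr * X.T @ (y - p) / len(y)
    return w

def predict(X, w):
    return 1.0 / (1.0 + np.exp(-X @ w))

def gdf_binary(X, y, flip_prob=0.1, n_reps=150, seed=0):
    """Monte Carlo GDF for binary responses (illustrative adaptation):
    perturb y by random bit flips, refit, then sum the per-observation
    slopes of fitted values on perturbed responses."""
    rng = np.random.default_rng(seed)
    n = len(y)
    y_pert = np.empty((n_reps, n))
    mu_hat = np.empty((n_reps, n))
    for t in range(n_reps):
        flips = rng.random(n) < flip_prob          # which responses to flip
        y_t = np.where(flips, 1 - y, y)
        w_t = fit_logistic(X, y_t)                 # refit on perturbed data
        y_pert[t] = y_t
        mu_hat[t] = predict(X, w_t)
    # h_i: least-squares slope of mu_i on y_i across replicates
    h = np.zeros(n)
    for i in range(n):
        yc = y_pert[:, i] - y_pert[:, i].mean()
        denom = (yc ** 2).sum()
        if denom > 0:
            h[i] = (yc * mu_hat[:, i]).sum() / denom
    return h.sum()

# Simulated binary-response data with a 3-parameter true model.
rng = np.random.default_rng(1)
n = 120
X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-(X[:, 1] - X[:, 2])))).astype(float)
gdf = gdf_binary(X, y)
```

For a correctly specified, unregularized model like this one, the GDF estimate should land near the nominal parameter count; the interesting regime studied in the paper is when the two diverge under misspecification.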
Problem

Research questions and friction points this paper is trying to address.

neural network complexity
effective degrees of freedom
model complexity
feed-forward neural networks
generalized degrees of freedom
Innovation

Methods, ideas, or system contributions that make the work stand out.

Generalized Degrees of Freedom
Neural Network Complexity
Model Complexity
Binary Outcomes
Robustness