Impact of Tuning Parameters in Deep Convolutional Neural Network Using a Crack Image Dataset

📅 2025-05-30
📈 Citations: 1
Influential: 1
📄 PDF
🤖 AI Summary
This study systematically investigates the impact of hyperparameter combinations on the performance of deep convolutional neural networks (DCNNs) in binary classification of crack images. A lightweight DCNN architecture—comprising two convolutional layers, two pooling layers, one Dropout layer, and one fully connected layer—is evaluated on a dataset of positive (crack) and negative (non-crack) images. The combined effects of three key hyperparameters are analyzed: pooling strategy (MaxPooling vs. AveragePooling), activation function (tanh vs. ReLU), and optimizer (Adam vs. SGD). Experimental results show that the combination of MaxPooling, the tanh activation, and the Adam optimizer achieves the best classification accuracy among the tested configurations. The study thus offers a reproducible, computationally efficient hyperparameter comparison for crack detection, a task where systematic quantitative analyses of tuning parameters remain scarce.
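The lightweight architecture described above can be sketched in Keras. This is a hypothetical reconstruction, not the authors' code: the input size (64×64×3), filter counts, kernel sizes, and dropout rate are all assumptions, while the layer sequence (2 conv, 2 pooling, 1 Dropout, 1 dense) follows the summary.

```python
# Hypothetical sketch of the lightweight DCNN: 2 convolutional layers,
# 2 pooling layers, 1 Dropout layer, and 1 dense (fully connected) output.
# Filter counts, kernel sizes, input shape, and dropout rate are assumptions.
from tensorflow import keras
from tensorflow.keras import layers

def build_dcnn(pooling: str = "max", activation: str = "tanh") -> keras.Model:
    # Choose the pooling strategy under study: MaxPooling vs. AveragePooling.
    Pool = layers.MaxPooling2D if pooling == "max" else layers.AveragePooling2D
    model = keras.Sequential([
        layers.Input(shape=(64, 64, 3)),        # input size is an assumption
        layers.Conv2D(16, (3, 3), activation=activation),
        Pool((2, 2)),
        layers.Conv2D(32, (3, 3), activation=activation),
        Pool((2, 2)),
        layers.Flatten(),
        layers.Dropout(0.5),                    # dropout rate assumed
        layers.Dense(1, activation="sigmoid"),  # binary: crack vs. non-crack
    ])
    # Optimizer is the third hyperparameter under study (Adam vs. SGD).
    model.compile(optimizer="adam", loss="binary_crossentropy",
                  metrics=["accuracy"])
    return model
```

Swapping `pooling`, `activation`, and the compiled optimizer reproduces the paper's comparison axes without changing the architecture itself.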

📝 Abstract
The performance of a classifier depends on the tuning of its parameters. In this paper, we have experimented with the impact of various tuning parameters on the performance of a deep convolutional neural network (DCNN). In the experimental evaluation, we have considered a DCNN classifier that consists of 2 convolutional layers (CL), 2 pooling layers (PL), 1 dropout layer, and a dense layer. To observe the impact of the pooling, activation function, and optimizer tuning parameters, we utilized a crack image dataset having two classes: negative and positive. The experimental results demonstrate that with maxpooling, the DCNN achieves its best performance with the adam optimizer and the tanh activation function.
Problem

Research questions and friction points this paper is trying to address.

Impact of tuning parameters on DCNN performance
Evaluating pooling, activation, optimizer effects on crack images
Optimizing DCNN for crack detection using specific parameters
Innovation

Methods, ideas, or system contributions that make the work stand out.

Used deep convolutional neural network (DCNN)
Tested pooling, activation, optimizer parameters
MaxPooling with Adam optimizer and tanh activation performed best
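The three parameter axes above define a 2 × 2 × 2 search space. A minimal sketch of enumerating it (the configuration names are taken from the summary; the grid itself is the paper's design, but this code is illustrative, not the authors'):

```python
# Enumerate the 2x2x2 hyperparameter grid compared in the paper:
# pooling strategy x activation function x optimizer.
from itertools import product

POOLINGS = ["MaxPooling", "AveragePooling"]
ACTIVATIONS = ["tanh", "relu"]
OPTIMIZERS = ["adam", "sgd"]

# Eight configurations in total; the paper reports the best accuracy for
# ("MaxPooling", "tanh", "adam").
configs = list(product(POOLINGS, ACTIVATIONS, OPTIMIZERS))
```

In practice, each tuple would be passed to a model-building function, trained on the crack dataset, and scored, keeping the configuration with the highest validation accuracy.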