🤖 AI Summary
To address the need for large-scale, real-time plant disease detection in agricultural fields, this paper proposes a lightweight, end-to-end method integrating quad-tree image decomposition with multi-scale deep feature learning. The approach introduces a novel synergistic mechanism between adaptive quad-tree partitioning and deep neural networks, preserving the computational efficiency of traditional image processing while fully leveraging the discriminative power of DNNs—thereby overcoming the classic accuracy–speed trade-off. It jointly optimizes adaptive quad-tree segmentation, multi-scale feature extraction, and a compact network architecture to enable unified disease localization and classification. Evaluated on a benchmark dataset comprising four common diseases of potato and tomato, the method achieves an F1-score of 0.80, with low inference latency and minimal computational overhead. Its efficiency enables direct deployment on resource-constrained edge devices, including agricultural drones and field robots.
📝 Abstract
Over the past decade, numerous image-processing methods and algorithms have been proposed for identifying plant diseases from visual data, and Deep Neural Networks (DNNs) have recently become popular for this task. However, both traditional image processing and DNN-based methods encounter significant performance issues in real-time detection, owing to computational limitations and the broad spectrum of plant disease features. This article proposes a novel technique for identifying and localising plant disease that combines Quad-Tree decomposition of an image with simultaneous feature learning. The proposed algorithm significantly improves accuracy and achieves faster convergence on high-resolution images with relatively low computational load, making it well suited for deployment on a standalone processor in a remotely operated image acquisition and disease detection system, ideally mounted on drones and robots working over large agricultural fields. The technique is hybrid: it exploits the advantages of traditional image processing methods and DNN-based models at different scales, resulting in faster inference. The F1 score is approximately 0.80 for four disease classes corresponding to potato and tomato crops.
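To make the core idea concrete, below is a minimal sketch of adaptive quad-tree partitioning: an image is recursively split into quadrants until each block is nearly homogeneous, so large uniform regions stay as a few coarse blocks while visually complex regions (e.g. lesion boundaries) are refined into many small ones. The variance-based homogeneity test, threshold value, and minimum block size here are illustrative assumptions, not the paper's exact criteria.

```python
import numpy as np

def quadtree_decompose(img, threshold=0.01, min_size=8):
    """Recursively split a square image into quadrants until each block's
    pixel variance falls below `threshold` or the block reaches `min_size`.
    Returns a list of (row, col, size) leaf blocks."""
    leaves = []

    def split(r, c, size):
        block = img[r:r + size, c:c + size]
        if size <= min_size or block.var() <= threshold:
            leaves.append((r, c, size))
            return
        half = size // 2
        for dr in (0, half):
            for dc in (0, half):
                split(r + dr, c + dc, half)

    split(0, 0, img.shape[0])
    return leaves

# Example: a synthetic 64x64 "leaf" image with one bright square patch
# standing in for a hypothetical diseased region.
img = np.zeros((64, 64))
img[16:32, 16:32] = 1.0
blocks = quadtree_decompose(img, threshold=0.01, min_size=8)
# The homogeneous background collapses into a few large blocks, while
# the patch and its surroundings are isolated in smaller blocks that a
# downstream classifier could then examine, rather than the full image.
```

In this toy case the 64x64 image reduces to seven leaf blocks (three 32x32 background blocks and four 16x16 blocks around the patch), illustrating how the decomposition concentrates later computation on the informative regions.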