AI Summary
This work proposes an energy-aware lightweight ensemble learning framework to address the challenge of deploying high-accuracy AI models on resource-constrained edge devices. The approach leverages knowledge distillation to transfer knowledge from a large convolutional neural network (CNN) into multiple compact sub-models and employs an optimized ensemble strategy to fuse their predictions. Experimental results on a self-collected coffee leaf disease dataset demonstrate that the proposed method achieves classification accuracy comparable to existing approaches while significantly reducing computational overhead, energy consumption, and carbon footprint. These advantages make the framework particularly suitable for sustainable edge deployment in Internet of Things (IoT) scenarios.
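To make the distillation step concrete, the sketch below shows a standard Hinton-style distillation loss in PyTorch. It is a minimal illustration, not the paper's implementation: the temperature `T`, the mixing weight `alpha`, and the function name are assumptions chosen for clarity, not values reported in this work.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """Blend a soft-target KL term (teacher -> student) with hard-label CE.

    T and alpha are illustrative hyperparameters, not the paper's settings.
    """
    # Soft targets: KL divergence between temperature-softened distributions.
    # The T^2 factor keeps gradient magnitudes comparable across temperatures.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    # Hard targets: standard cross-entropy against the ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard
```

In this formulation, the large teacher CNN's softened output distribution supervises each compact student alongside the ground-truth labels, which is the usual way distillation transfers knowledge into smaller models.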
Abstract
Coffee yields depend on the timely and accurate diagnosis of diseases; however, assessing leaf diseases in the field presents significant challenges. Although Artificial Intelligence (AI) vision models achieve high accuracy, their adoption is hindered by resource-constrained devices and intermittent connectivity. This study aims to enable sustainable on-device diagnosis through knowledge distillation: high-capacity Convolutional Neural Networks (CNNs) trained in data centers transfer their knowledge to compact CNNs, whose predictions are then fused through Ensemble Learning (EL). Furthermore, the distilled tiny models were integrated through both simple and optimized ensembling to enhance accuracy while adhering to strict computational and energy constraints. On a curated coffee leaf dataset, distilled tiny ensembles achieved accuracy competitive with prior work at significantly reduced energy consumption and carbon footprint. This indicates that lightweight models, when properly distilled and ensembled, can provide practical diagnostic solutions for Internet of Things (IoT) applications.
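As a rough illustration of the ensembling step, the sketch below fuses the softmax outputs of several compact students by weighted averaging. The `weights` vector stands in for the optimized ensemble strategy (whose exact form is not given here); uniform weights recover the simple variant. Function and argument names are assumptions for illustration only.

```python
import torch

@torch.no_grad()
def ensemble_predict(models, x, weights=None):
    """Fuse class probabilities from several compact models on a batch x.

    `weights` is a placeholder for an optimized per-model weighting;
    passing None averages the models uniformly (the simple strategy).
    """
    # Stack per-model softmax probabilities: shape (num_models, batch, classes).
    probs = torch.stack([m(x).softmax(dim=1) for m in models])
    if weights is None:
        weights = torch.full((len(models),), 1.0 / len(models))
    # Weighted average over the model axis, then pick the top class.
    fused = (weights.view(-1, 1, 1) * probs).sum(dim=0)
    return fused.argmax(dim=1)
```

Averaging probabilities rather than raw logits is one common fusion choice; an optimized strategy could, for example, tune the per-model weights on a validation split.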