🤖 AI Summary
Remote sensing struggles to directly quantify crop physiological stress indicators. To address this limitation for precision agriculture, this study proposes a synergistic approach integrating leaf-surface in-situ colorimetric sensing with ground robot–based hyperspectral dynamic imaging. We introduce a novel integrated architecture combining conformal colorimetric sensors with an autonomous robotic platform, enabling field-deployable visual sensor detection, adaptive positioning, controlled-illumination hyperspectral reflectance acquisition, and interpretable spectral analysis. The system integrates monocular RGB vision-based localization, YOLO object detection, motorized mirror control, halogen illumination compensation, and push-broom hyperspectral imaging. Indoor and outdoor experiments demonstrate fully autonomous operation—including navigation, sensor identification, hyperspectral data acquisition, and feature extraction—with field-collected spectra showing excellent agreement (R² > 0.98) against laboratory-grade spectrometer measurements. This framework significantly enhances the directness, real-time capability, and field applicability of crop health monitoring.
📝 Abstract
Current remote sensing technologies used to measure crop health (e.g., RGB, multispectral, hyperspectral, and LiDAR) are indirect and cannot capture plant stress indicators directly. Instead, low-cost leaf sensors that directly interface with the crop surface present an opportunity to advance real-time, direct monitoring. To this end, we co-design a sensor-detector system, in which the sensor is a novel colorimetric leaf sensor that directly measures crop health in a precision agriculture setting, and the detector autonomously obtains optical signals from these leaf sensors. The system integrates a ground robot platform with an on-board monocular RGB camera and object detector to localize the leaf sensor, and a hyperspectral camera with a motorized mirror and an on-board halogen light to acquire a hyperspectral reflectance image of the leaf sensor, from which a spectral response characterizing crop health can be extracted. We successfully demonstrate our co-designed system operating in outdoor environments, obtaining spectra that are interpretable when compared against controlled laboratory-grade spectrometer measurements. In row-crop environments both indoors and outdoors, the system autonomously navigates, locates and acquires a hyperspectral image of every leaf sensor present, and retrieves interpretable spectral resonances from the leaf sensors.
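The agreement between field-collected spectra and laboratory spectrometer measurements is reported as R² > 0.98. A minimal sketch of that comparison, using the standard coefficient of determination on synthetic illustrative spectra (the function and data below are assumptions, not the authors' code or measurements):

```python
# Minimal sketch (not the authors' implementation): quantifying how well a
# field-collected reflectance spectrum agrees with a laboratory-grade
# spectrometer reference, using the coefficient of determination (R^2).

def r_squared(reference, measured):
    """Coefficient of determination of `measured` against `reference`."""
    mean_ref = sum(reference) / len(reference)
    ss_res = sum((r - m) ** 2 for r, m in zip(reference, measured))
    ss_tot = sum((r - mean_ref) ** 2 for r in reference)
    return 1.0 - ss_res / ss_tot

# Synthetic example: a lab reference spectrum and a slightly noisy
# field-acquired copy, sampled at the same wavelengths.
lab = [0.10, 0.12, 0.35, 0.60, 0.55, 0.30, 0.15]
field = [0.11, 0.12, 0.34, 0.61, 0.54, 0.31, 0.14]
print(f"R^2 = {r_squared(lab, field):.3f}")  # close to 1 for good agreement
```

In practice the field spectrum would first be illumination-compensated (here, via the on-board halogen light) and resampled to the reference wavelengths before the comparison.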