Robotic Monitoring of Colorimetric Leaf Sensors for Precision Agriculture

📅 2025-05-20
📈 Citations: 0
Influential: 0
🤖 AI Summary
Remote sensing struggles to directly quantify crop physiological stress indicators. To address this limitation for precision agriculture, this study proposes a synergistic approach integrating leaf-surface in-situ colorimetric sensing with ground robot–based hyperspectral dynamic imaging. We introduce a novel integrated architecture combining conformal colorimetric sensors with an autonomous robotic platform, enabling field-deployable visual sensor detection, adaptive positioning, controlled-illumination hyperspectral reflectance acquisition, and interpretable spectral analysis. The system integrates monocular RGB vision-based localization, YOLO object detection, motorized mirror control, halogen illumination compensation, and push-broom hyperspectral imaging. Indoor and outdoor experiments demonstrate fully autonomous operation—including navigation, sensor identification, hyperspectral data acquisition, and feature extraction—with field-collected spectra showing excellent agreement (R² > 0.98) against laboratory-grade spectrometer measurements. This framework significantly enhances the directness, real-time capability, and field applicability of crop health monitoring.

📝 Abstract
Current remote sensing technologies that measure crop health (e.g., RGB, multispectral, hyperspectral, and LiDAR) are indirect and cannot capture plant stress indicators directly. Low-cost leaf sensors that directly interface with the crop surface instead present an opportunity for real-time, direct monitoring. To this end, we co-design a sensor-detector system: the sensor is a novel colorimetric leaf sensor that directly measures crop health in a precision agriculture setting, and the detector autonomously reads optical signals from these leaf sensors. The system integrates a ground robot platform with an on-board monocular RGB camera and object detector to localize the leaf sensor, and a hyperspectral camera with a motorized mirror and an on-board halogen light to acquire a hyperspectral reflectance image of the leaf sensor, from which a spectral response characterizing crop health is extracted. We demonstrate the co-designed system operating in outdoor environments, obtaining spectra that are interpretable when compared against controlled laboratory-grade spectrometer measurements. In row-crop environments, both indoors and outdoors, the system autonomously navigates, locates and acquires a hyperspectral image of every leaf sensor present, and retrieves interpretable spectral resonances from the leaf sensors.
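The summary reports agreement of R² > 0.98 between field-acquired spectra and laboratory spectrometer measurements. As a minimal illustration of that comparison (not the authors' pipeline), the sketch below computes the coefficient of determination between a field spectrum and a lab reference treated as ground truth; the wavelength range and resonance shape are entirely synthetic.

```python
import numpy as np

def r_squared(field: np.ndarray, lab: np.ndarray) -> float:
    """Coefficient of determination of a field-acquired spectrum
    against a laboratory reference spectrum (treated as ground truth)."""
    ss_res = np.sum((field - lab) ** 2)          # residual sum of squares
    ss_tot = np.sum((lab - lab.mean()) ** 2)     # total sum of squares
    return float(1.0 - ss_res / ss_tot)

# Synthetic example: a reference spectrum with a resonance dip near 650 nm,
# and a "field" copy perturbed by small measurement noise.
wavelengths = np.linspace(400, 1000, 300)        # nm, hypothetical range
lab = 0.6 - 0.3 * np.exp(-((wavelengths - 650) / 30) ** 2)
field = lab + np.random.default_rng(0).normal(0.0, 0.005, lab.shape)
print(r_squared(field, lab))                     # close to 1 for small noise
```

This is the standard R² definition; a library equivalent is `sklearn.metrics.r2_score`.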
Problem

Research questions and friction points this paper is trying to address.

Direct measurement of crop health using colorimetric leaf sensors
Autonomous robotic system for real-time plant stress monitoring
Integration of hyperspectral imaging for precise agricultural analysis
Innovation

Methods, ideas, or system contributions that make the work stand out.

Co-designed colorimetric leaf sensor for direct crop health monitoring
Robot with RGB and hyperspectral cameras for autonomous sensor detection
Hyperspectral imaging with motorized mirror for precise spectral analysis
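The innovations above couple an RGB object detector with a motorized mirror that steers the hyperspectral line of sight toward a detected sensor. A minimal sketch of that hand-off, under an assumed pinhole camera model (the focal lengths, principal point, and function name are illustrative, not from the paper), maps a detection's bounding-box center to pan/tilt angles:

```python
import math

def bbox_center_to_angles(bbox, fx, fy, cx, cy):
    """Map a detected bounding box (x1, y1, x2, y2, in pixels) to pan/tilt
    angles (radians) off the optical axis under a pinhole camera model.
    fx, fy are focal lengths in pixels; cx, cy is the principal point."""
    x1, y1, x2, y2 = bbox
    u = (x1 + x2) / 2.0                # bounding-box center, x (pixels)
    v = (y1 + y2) / 2.0                # bounding-box center, y (pixels)
    pan = math.atan2(u - cx, fx)       # horizontal steering angle
    tilt = math.atan2(v - cy, fy)      # vertical angle (image y grows downward)
    return pan, tilt

# A detection centered on the principal point needs no mirror deflection.
print(bbox_center_to_angles((300, 220, 340, 260), 800.0, 800.0, 320.0, 240.0))
```

In a real system these angles would still have to be transformed into the mirror's own actuator frame via an extrinsic calibration between the RGB camera and the hyperspectral optical path.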
Malakhi Hopkins
GRASP Laboratory, University of Pennsylvania, Pennsylvania, USA
Alice Kate Li
GRASP Laboratory, University of Pennsylvania, Pennsylvania, USA
Shobhita Kramadhati
GRASP Laboratory, University of Pennsylvania, Pennsylvania, USA
Jackson Arnold
University of Florida, Florida, USA
Akhila Mallavarapu
GRASP Laboratory, University of Pennsylvania, Pennsylvania, USA
C. Lawrence
GRASP Laboratory, University of Pennsylvania, Pennsylvania, USA
Varun Murali
Assistant Professor, Texas A&M University
Sanjeev J. Koppal
University of Florida, Florida, USA
Cherie Kagan
University of Pennsylvania
Vijay Kumar
GRASP Laboratory, University of Pennsylvania, Pennsylvania, USA