Hydra: Accurate Multi-Modal Leaf Wetness Sensing with mm-Wave and Camera Fusion

📅 2025-08-04
📈 Citations: 0
Influential: 0
🤖 AI Summary
Existing leaf wetness duration (LWD) detection methods lack standardization and adapt poorly to real plants and dynamic agricultural environments, resulting in limited robustness and generalizability. To address this, we propose a multimodal fusion approach integrating 76–81 GHz frequency-modulated continuous-wave (FMCW) millimeter-wave radar with an RGB camera. We design a CNN-Transformer hybrid architecture with a selective feature fusion mechanism, enabling, for the first time, environment-robust perception of natural leaf-surface water films and precise LWD estimation. Leveraging deep cross-modal modeling between radar-derived depth maps and RGB imagery, combined with targeted data augmentation, the model achieves 90–96% detection accuracy across diverse real-world farm conditions, including rainy, dawn, and low-light nighttime scenarios. This advance substantially improves the practicality, reliability, and environmental robustness of agricultural LWD monitoring.

📝 Abstract
Leaf Wetness Duration (LWD), the time that water remains on leaf surfaces, is crucial to the development of plant diseases. Existing LWD detection lacks standardized measurement techniques, and variation across plant characteristics limits its effectiveness. Prior research proposes diverse approaches, but they fail to measure real natural leaves directly and lack resilience across environmental conditions, reducing precision and robustness and revealing a notable gap between research prototypes and practical effectiveness in real-world agricultural settings. This paper presents Hydra, an approach that integrates millimeter-wave (mm-Wave) radar with a camera to detect leaf wetness by determining whether water is present on the leaf; LWD is then obtained by timing how long the wet state persists. First, we design a Convolutional Neural Network (CNN) that selectively fuses multiple mm-Wave depth images with an RGB image to generate multiple feature images. We then develop a transformer-based encoder that captures the inherent connections among these feature images and produces a feature map, which is fed to a classifier for detection. Moreover, we augment the dataset during training to improve the model's generalization. Implemented with a frequency-modulated continuous-wave (FMCW) radar operating in the 76–81 GHz band, Hydra is evaluated on real plants and classifies leaf wetness with up to 96% accuracy across varying scenarios. Deployed on a farm under rainy, dawn, and poorly lit nighttime conditions, it still achieves an accuracy of around 90%.
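The pipeline the abstract describes (CNN branches turning mm-Wave depth images and an RGB image into feature tokens, selective fusion, a transformer-style encoder relating the tokens, and a wet/dry classifier head) can be sketched in toy form. All shapes, the softmax gating rule, the single-head attention, and the random projections below are illustrative assumptions, not Hydra's actual implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy inputs: 4 mm-Wave depth maps and one RGB frame, shape (N, C, H, W).
depth_maps = rng.standard_normal((4, 1, 16, 16))
rgb        = rng.standard_normal((1, 3, 16, 16))

def encode(x, out_dim=8):
    """Stand-in for a CNN branch: flatten each image and project it
    to a fixed-size feature vector ("feature image" token)."""
    flat = x.reshape(x.shape[0], -1)
    w = rng.standard_normal((flat.shape[1], out_dim)) * 0.01
    return flat @ w                                    # (N, out_dim)

# One feature token per input image.
tokens = np.concatenate([encode(depth_maps), encode(rgb)], axis=0)  # (5, 8)

# Selective fusion (assumed form): softmax gate weighting each
# token's contribution before the encoder sees it.
gate = np.exp(tokens.mean(axis=1))
gate /= gate.sum()
tokens = tokens * gate[:, None]

def self_attention(x):
    """Single-head scaled dot-product attention, the core operation
    of a transformer encoder layer."""
    d = x.shape[1]
    scores = x @ x.T / np.sqrt(d)
    attn = np.exp(scores - scores.max(axis=1, keepdims=True))
    attn /= attn.sum(axis=1, keepdims=True)
    return attn @ x

# Encoder relates the feature tokens; mean-pool into one feature map.
fused = self_attention(tokens).mean(axis=0)            # (8,)

# Binary classifier head: a single wet-vs-dry logit.
w_cls = rng.standard_normal(fused.shape[0]) * 0.1
logit = fused @ w_cls
print("wet" if logit > 0 else "dry")
```

The point of the sketch is the data flow, not the weights: per-image encoders produce tokens of a common dimension, so heterogeneous modalities (radar depth vs. RGB) can be gated and attended over jointly before a single classifier decides wet or dry.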
Problem

Research questions and friction points this paper is trying to address.

Lacks standardized leaf wetness measurement techniques
Existing methods fail in diverse environmental conditions
Low precision and robustness in real-world agriculture
Innovation

Methods, ideas, or system contributions that make the work stand out.

Fuses mm-Wave radar with camera for leaf wetness
Uses CNN and transformer for feature fusion
Achieves high accuracy in diverse conditions
Yimeng Liu
University of California, Santa Barbara
Human-Computer Interaction · Human-AI Interaction · Human-Centered AI
Maolin Gan
Michigan State University, USA
Huaili Zeng
Michigan State University, USA
Li Liu
Tsinghua University, China
Younsuk Dong
Michigan State University, USA
Zhichao Cao
Michigan State University, USA