AgriChrono: A Multi-modal Dataset Capturing Crop Growth and Lighting Variability with a Field Robot

📅 2025-08-26
📈 Citations: 0
Influential citations: 0
🤖 AI Summary
Existing agricultural datasets are predominantly collected in static, controlled environments using single-modality sensors and short temporal sequences, failing to capture real-world field dynamics—including diurnal illumination variations, crop growth progression, and natural disturbances—thereby limiting model generalizability. To address this, we introduce the first multimodal, long-term dataset specifically designed for dynamic outdoor farmland. Acquired via an autonomous field robot, it synchronously captures RGB, depth, LiDAR, and IMU data across full daylight cycles and complete crop growth stages. The platform enables remote operation, sub-millisecond hardware time synchronization, and repeatable field deployments, and includes a standardized 3D reconstruction benchmark. Both dataset and code are publicly released. Experimental evaluation reveals substantial performance degradation of state-of-the-art 3D reconstruction models under real-field conditions, empirically validating the dataset’s critical role in advancing robustness and generalization of agricultural perception systems.

📝 Abstract
Existing datasets for precision agriculture have primarily been collected in static or controlled environments such as indoor labs or greenhouses, often with limited sensor diversity and restricted temporal span. These conditions fail to reflect the dynamic nature of real farmland, including illumination changes, crop growth variation, and natural disturbances. As a result, models trained on such data often lack robustness and generalization when applied to real-world field scenarios. In this paper, we present AgriChrono, a novel robotic data collection platform and multi-modal dataset designed to capture the dynamic conditions of real-world agricultural environments. Our platform integrates multiple sensors and enables remote, time-synchronized acquisition of RGB, Depth, LiDAR, and IMU data, supporting efficient and repeatable long-term data collection across varying illumination and crop growth stages. We benchmark a range of state-of-the-art 3D reconstruction models on the AgriChrono dataset, highlighting the difficulty of reconstruction in real-world field environments and demonstrating its value as a research asset for advancing model generalization under dynamic conditions. The code and dataset are publicly available at: https://github.com/StructuresComp/agri-chrono
Problem

Research questions and friction points this paper is trying to address.

Captures dynamic crop growth and lighting variability
Addresses limited sensor diversity in agricultural datasets
Improves model robustness for real-world field conditions
Innovation

Methods, ideas, or system contributions that make the work stand out.

Robotic platform for multi-sensor data collection
Time-synchronized RGB, Depth, LiDAR, IMU acquisition
Long-term monitoring across illumination and growth stages
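The time-synchronized multi-sensor acquisition above implies pairing frames from streams that tick at different rates (e.g., RGB, LiDAR, IMU) against a common clock. The sketch below is purely illustrative and is not the authors' released code: `align_streams`, the sensor rates, and the tolerance value are assumptions chosen to show nearest-timestamp matching with rejection of poorly synchronized pairs.

```python
# Illustrative sketch (not from the AgriChrono codebase): align multi-rate
# sensor streams by nearest-timestamp matching against a reference stream.
from bisect import bisect_left

def nearest(ts_list, t):
    """Return the timestamp in sorted ts_list closest to t."""
    i = bisect_left(ts_list, t)
    candidates = ts_list[max(i - 1, 0):i + 1]
    return min(candidates, key=lambda x: abs(x - t))

def align_streams(reference, others, tolerance):
    """For each reference timestamp, find the nearest timestamp in every
    other stream; keep the tuple only if all matches fall within tolerance."""
    matched = []
    for t in reference:
        row = {"ref": t}
        ok = True
        for name, ts_list in others.items():
            m = nearest(ts_list, t)
            if abs(m - t) > tolerance:
                ok = False
                break
            row[name] = m
        if ok:
            matched.append(row)
    return matched

# Hypothetical rates: 30 Hz RGB as reference, 10 Hz LiDAR, 200 Hz IMU (seconds).
rgb = [i / 30 for i in range(30)]
lidar = [i / 10 for i in range(10)]
imu = [i / 200 for i in range(200)]
pairs = align_streams(rgb, {"lidar": lidar, "imu": imu}, tolerance=0.06)
```

In practice, hardware-level triggering (as the platform reportedly provides) replaces this software matching with shared trigger pulses, but the tolerance-based rejection remains a useful sanity check on recorded timestamps.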
Jaehwan Jeong
Samsung Electronics
AI Security, Computer Security

Tuan-Anh Vu
Department of Mechanical & Aerospace Engineering, University of California, Los Angeles, CA 90095, USA.

Mohammad Jony
Department of Plant Sciences, North Dakota State University, Fargo, ND 58102, USA.

Shahab Ahmad
Department of Plant Sciences, North Dakota State University, Fargo, ND 58102, USA.

Md. Mukhlesur Rahman
Department of Plant Sciences, North Dakota State University, Fargo, ND 58102, USA.

Sangpil Kim
Korea University
Computer Vision

M. Khalid Jawed
UCLA (Structures-Computer Interaction Lab)
Solid and structural mechanics, robotics, physics-assisted machine learning