DFYP: A Dynamic Fusion Framework with Spectral Channel Attention and Adaptive Operator Learning for Crop Yield Prediction

📅 2025-07-08
🤖 AI Summary
Remote sensing-based crop yield prediction faces challenges including complex spatial patterns, strong spectral heterogeneity, and poor generalization across crops and years. To address these, this paper proposes a dynamic multi-source fusion framework. Methodologically, we introduce a resolution-aware channel attention (RCA) module to model multi-scale spectral responses; design an edge-adaptive operator learning network (AOL-Net) to enhance spatial structural representation; and incorporate a learnable dual-branch fusion mechanism to enable cross-resolution synergistic modeling of MODIS and Sentinel-2 data. Extensive experiments on a real-world, multi-temporal, multi-crop dataset demonstrate that our approach consistently outperforms state-of-the-art methods, achieving superior performance in RMSE, MAE, and R². The results validate its strong generalizability across crops and years, robustness to spectral and spatial variability, and practical utility for operational agricultural monitoring.

๐Ÿ“ Abstract
Accurate remote sensing-based crop yield prediction remains a fundamentally challenging task due to complex spatial patterns, heterogeneous spectral characteristics, and dynamic agricultural conditions. Existing methods often suffer from limited spatial modeling capacity and weak generalization across crop types and years. To address these challenges, we propose DFYP, a novel Dynamic Fusion framework for crop Yield Prediction, which combines spectral channel attention, edge-adaptive spatial modeling, and a learnable fusion mechanism to improve robustness across diverse agricultural scenarios. Specifically, DFYP introduces three key components: (1) a Resolution-aware Channel Attention (RCA) module that enhances spectral representation by adaptively reweighting input channels based on resolution-specific characteristics; (2) an Adaptive Operator Learning Network (AOL-Net) that dynamically selects operators for convolutional kernels to improve edge-sensitive spatial feature extraction under varying crop and temporal conditions; and (3) a dual-branch architecture with a learnable fusion mechanism, which jointly models local spatial details and global contextual information to support cross-resolution and cross-crop generalization. Extensive experiments on the multi-year MODIS and multi-crop Sentinel-2 datasets demonstrate that DFYP consistently outperforms current state-of-the-art baselines in RMSE, MAE, and R² across different spatial resolutions, crop types, and time periods, showcasing its effectiveness and robustness for real-world agricultural monitoring.
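The paper's implementation is not included on this page. As a rough illustration of two of the ideas the abstract names — channel reweighting driven by a global spectral summary, and a learnable fusion of a local and a global branch — the following NumPy sketch shows the general shape of such operations. The function names, the squeeze-and-excitation-style bottleneck, and the scalar fusion weight are assumptions for illustration, not the authors' RCA or fusion design.

```python
import numpy as np

def channel_attention(x, w1, w2):
    """Squeeze-and-excitation-style channel reweighting (illustrative).

    x  : feature map of shape (C, H, W)
    w1 : learned projection, shape (C_mid, C)
    w2 : learned projection, shape (C, C_mid)
    Returns x with every channel scaled by a weight in (0, 1).
    """
    # Squeeze: global average pool over the spatial dimensions -> (C,)
    s = x.mean(axis=(1, 2))
    # Excite: bottleneck MLP with ReLU, then sigmoid gating -> (C,)
    h = np.maximum(w1 @ s, 0.0)
    a = 1.0 / (1.0 + np.exp(-(w2 @ h)))
    # Broadcast the per-channel weights back over (H, W)
    return x * a[:, None, None]

def learnable_fusion(f_local, f_global, alpha_logit):
    """Convex combination of two same-shaped branches.

    alpha_logit is a learnable scalar; the sigmoid keeps the
    fusion weight in (0, 1) so the output stays a weighted average.
    """
    alpha = 1.0 / (1.0 + np.exp(-alpha_logit))
    return alpha * f_local + (1.0 - alpha) * f_global
```

In a real model both pieces would be trained end to end (e.g. as `nn.Module`s in PyTorch) and the fusion weight could be a per-channel vector rather than a scalar; the sketch only shows the data flow.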
Problem

Research questions and friction points this paper is trying to address.

Improving crop yield prediction accuracy in dynamic agricultural conditions
Enhancing spectral and spatial feature extraction for diverse crops
Addressing generalization across different resolutions and crop types
Innovation

Methods, ideas, or system contributions that make the work stand out.

Resolution-aware Channel Attention enhances spectral representation
Adaptive Operator Learning improves edge-sensitive feature extraction
Dual-branch architecture models local and global information
Juli Zhang
School of Computer Science and Technology, Xidian University, Xi'an, 710065, China; School of Computing, Australian National University, Canberra, ACT 2601, Australia

Zeyu Yan
PhD, Computer Science, University of Maryland, College Park
Human-Computer Interaction · Digital Fabrication · Sustainability · Interaction Systems · Accessibility

Jing Zhang
School of Computing, Australian National University, Canberra, ACT 2601, Australia

Qiguang Miao
School of Computer Science and Technology, Xidian University, Xi'an, 710065, China

Quan Wang
School of Computer Science and Technology, Xidian University, Xi'an, 710065, China