MCVI-SANet: A lightweight semi-supervised model for LAI and SPAD estimation of winter wheat under vegetation index saturation

📅 2025-12-20
🤖 AI Summary
To address low estimation accuracy of leaf area index (LAI) and SPAD values in winter wheat under dense canopy conditions—caused by vegetation index (VI) saturation and scarcity of ground-truth annotations—this paper proposes MCVI-SANet, a lightweight semi-supervised vision model. Its core innovation is the Vegetation Index Saturation-Aware Block (VI-SABlock), which integrates agronomic priors with the VICReg semi-supervised learning strategy to enable channel- and spatial-wise adaptive feature enhancement and robust generalization from limited samples. The model further incorporates multi-channel VI fusion, plant-height-guided data partitioning, and a lightweight CNN backbone. Experiments demonstrate significant improvements: LAI estimation achieves R² = 0.8123 (+8.95% over baselines) and RMSE = 0.4796; SPAD estimation attains R² = 0.6846 (+8.17%) and RMSE = 2.4222. With only 0.10 million parameters, MCVI-SANet delivers both high accuracy and efficient inference.

📝 Abstract
Vegetation index (VI) saturation during the dense canopy stage and limited ground-truth annotations of winter wheat constrain accurate estimation of LAI and SPAD. Existing VI-based and texture-driven machine learning methods exhibit limited feature expressiveness. In addition, deep learning baselines suffer from domain gaps and high data demands, which restrict their generalization. Therefore, this study proposes the Multi-Channel Vegetation Indices Saturation Aware Net (MCVI-SANet), a lightweight semi-supervised vision model. The model incorporates a newly designed Vegetation Index Saturation-Aware Block (VI-SABlock) for adaptive channel-spatial feature enhancement. It also integrates a VICReg-based semi-supervised strategy to further improve generalization. Datasets were partitioned using a vegetation height-informed strategy to maintain representativeness across growth stages. Experiments over 10 repeated runs demonstrate that MCVI-SANet achieves state-of-the-art accuracy. The model attains an average R² of 0.8123 and RMSE of 0.4796 for LAI, and an average R² of 0.6846 and RMSE of 2.4222 for SPAD. This performance surpasses the best-performing baselines, with improvements of 8.95% in average LAI R² and 8.17% in average SPAD R². Moreover, MCVI-SANet maintains high inference speed with only 0.10M parameters. Overall, the integration of semi-supervised learning with agronomic priors provides a promising approach for enhancing remote sensing-based precision agriculture.
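The VICReg objective the abstract refers to is a published self-supervised criterion combining invariance, variance, and covariance terms on embeddings of two augmented views. A minimal NumPy sketch is shown below; the loss weights, batch size, and embedding dimension here are illustrative assumptions, not the paper's settings.

```python
import numpy as np

def vicreg_loss(z_a, z_b, sim_w=25.0, var_w=25.0, cov_w=1.0, eps=1e-4):
    """VICReg-style loss on two (N, D) batches of view embeddings."""
    n, d = z_a.shape
    # Invariance: mean squared error between the two views' embeddings.
    sim = np.mean((z_a - z_b) ** 2)

    # Variance: hinge that keeps each embedding dimension's std above 1,
    # discouraging collapse to a constant representation.
    def var_term(z):
        std = np.sqrt(z.var(axis=0) + eps)
        return np.mean(np.maximum(0.0, 1.0 - std))

    # Covariance: penalize off-diagonal covariance entries so that
    # embedding dimensions carry decorrelated information.
    def cov_term(z):
        zc = z - z.mean(axis=0)
        cov = (zc.T @ zc) / (n - 1)
        off = cov - np.diag(np.diag(cov))
        return np.sum(off ** 2) / d

    return (sim_w * sim
            + var_w * (var_term(z_a) + var_term(z_b))
            + cov_w * (cov_term(z_a) + cov_term(z_b)))
```

In a semi-supervised setup like the one described, a loss of this form would be computed on unlabeled imagery alongside the supervised regression loss on the labeled subset.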
Problem

Research questions and friction points this paper is trying to address.

Estimates LAI and SPAD of winter wheat despite vegetation index saturation.
Addresses limited feature expressiveness in existing VI and texture methods.
Overcomes domain gaps and high data demands of deep learning models.
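The saturation problem in the first bullet can be illustrated with NDVI, the most common VI: as canopy density grows, red reflectance bottoms out while NIR levels off, so NDVI barely changes at high LAI. The toy reflectance model below is an illustrative Beer-Lambert-style sketch, not data or coefficients from the paper.

```python
import numpy as np

def reflectance(lai):
    """Toy canopy reflectance vs. LAI (illustrative coefficients)."""
    red = 0.04 + 0.26 * np.exp(-0.9 * lai)   # red absorption saturates quickly
    nir = 0.50 - 0.32 * np.exp(-0.4 * lai)   # NIR rises, then levels off
    return red, nir

def ndvi(lai):
    red, nir = reflectance(lai)
    return (nir - red) / (nir + red)

# NDVI climbs steeply at low LAI but flattens under dense canopy,
# which is exactly the regime where LAI/SPAD estimation degrades.
for lai in (1, 3, 5, 7):
    print(f"LAI={lai}: NDVI={ndvi(lai):.3f}")
```

Under this model the NDVI gain from LAI 1 to 3 is an order of magnitude larger than from LAI 5 to 7, which is why a saturation-aware block that draws on additional channels and spatial texture can help where a single VI cannot.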
Innovation

Methods, ideas, or system contributions that make the work stand out.

Lightweight semi-supervised vision model for crop estimation
Vegetation Index Saturation-Aware Block for adaptive feature enhancement
VICReg-based semi-supervised strategy to improve generalization
Zhiheng Zhang
College of Information Science and Engineering, Shandong Agricultural University, Taian 271018, China
Jiajun Yang
College of Information Science and Engineering, Shandong Agricultural University, Taian 271018, China
Hong Sun
LLNL
Dong Wang
College of Agriculture, Northwest Agriculture and Forestry University, Yangling 712100, China
Honghua Jiang
College of Information Science and Engineering, Shandong Agricultural University, Taian 271018, China
Yaru Chen
Centre for Vision Speech and Signal Processing (CVSSP), University of Surrey
Tangyuan Ning
College of Information Science and Engineering, Shandong Agricultural University, Taian 271018, China