🤖 AI Summary
This work addresses the challenge of delineating annual ring boundaries in low-quality RGB cross-sectional images of *Pinus taeda* captured by smartphones, where poor contrast, irregular textures, and non-uniform illumination impede accurate segmentation. Rather than proposing a new model, the study applies the Iterative Next Boundary Detection (INBD) network of Gillert et al. (CVPR 2023) to this out-of-distribution setting, since the UruDendro images differ markedly from the microscopy data INBD was trained on. INBD operates in two U-Net-based stages: Stage I performs semantic segmentation of background, pith, and ring boundaries; Stage II transforms the image into polar coordinates around the pith and iteratively delineates ring boundaries outward toward the bark. On the UruDendro evaluation set, the method achieves an F-Score of 77.5%, a mean Average Recall (mAR) of 0.540, and an Adjusted Rand index (ARAND) of 0.205. The source code is publicly available.
📝 Abstract
This work presents the INBD network proposed by Gillert et al. at CVPR 2023 and studies its application to delineating tree rings in smartphone RGB images of *Pinus taeda* cross sections (the UruDendro dataset), whose characteristics differ from those of the images used to train the method. INBD operates in two stages, both based on the U-Net architecture: the first segments the background, the pith, and the ring boundaries; in the second, the image is transformed into polar coordinates and ring boundaries are iteratively segmented from the pith to the bark. The method achieves an F-Score of 77.5, a mAR of 0.540, and an ARAND of 0.205 on the evaluation set. The code for the experiments is available at https://github.com/hmarichal93/mlbrief_inbd.
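The polar transformation in the second stage can be illustrated with a minimal sketch. This is not the INBD implementation; it is a hypothetical NumPy resampler (function name `to_polar` and its parameters are assumptions) showing the core idea: sampling the image along rays from the pith so that each ring becomes a roughly horizontal curve, which makes iterative pith-to-bark boundary tracing tractable.

```python
import numpy as np

def to_polar(img, center, n_angles=360, n_radii=256):
    """Resample a cross-section image into polar coordinates around the pith.

    Rows of the output correspond to radii (pith -> bark) and columns to
    angles, so each ring boundary maps to an approximately horizontal curve.
    Nearest-neighbour sampling is used for simplicity; a real pipeline would
    typically interpolate bilinearly.
    """
    cy, cx = center
    h, w = img.shape[:2]
    # Largest radius that stays fully inside the image bounds
    max_r = min(cx, cy, w - 1 - cx, h - 1 - cy)
    thetas = np.linspace(0.0, 2.0 * np.pi, n_angles, endpoint=False)
    radii = np.linspace(0.0, max_r, n_radii)
    # Cartesian sample coordinates for every (radius, angle) pair
    ys = cy + radii[:, None] * np.sin(thetas[None, :])
    xs = cx + radii[:, None] * np.cos(thetas[None, :])
    return img[np.round(ys).astype(int), np.round(xs).astype(int)]
```

On a synthetic image of perfect concentric rings, every row of the polar output is nearly constant, which is exactly the property the boundary-tracing stage exploits.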