Learning to Infer Parameterized Representations of Plants from 3D Scans

๐Ÿ“… 2025-05-28
๐Ÿ“ˆ Citations: 0
โœจ Influential: 0
๐Ÿ“„ PDF
๐Ÿค– AI Summary
Reconstructing parametric tree-like structures from plant 3D scan point clouds remains challenging due to organ density, severe self-occlusion, and tight coupling between topology and geometry. Method: This paper introduces the first end-to-end, data-driven framework that directly learns a differentiable, parametric binary axial tree representation from unstructured point cloudsโ€”jointly modeling branch topology, organ geometry, and semantic segmentation. Leveraging synthetic plant data generated via L-systems, a recurrent neural network is trained to map raw point clouds to structured parameters, including branch angles, lengths, radii, and organ types. Contribution/Results: The learned representation natively supports downstream tasks such as reconstruction, skeletonization, and part segmentation. Evaluated on real-world 3D scans of *Chenopodium album*, the method achieves state-of-the-art reconstruction fidelity and organ parameter accuracy, significantly outperforming conventional multi-stage approaches.

Technology Category

Application Category

๐Ÿ“ Abstract
Reconstructing faithfully the 3D architecture of plants from unstructured observations is a challenging task. Plants frequently contain numerous organs, organized in branching systems in more or less complex spatial networks, leading to specific computational issues due to self-occlusion or spatial proximity between organs. Existing works either consider inverse modeling where the aim is to recover the procedural rules that allow to simulate virtual plants, or focus on specific tasks such as segmentation or skeletonization. We propose a unified approach that, given a 3D scan of a plant, allows to infer a parameterized representation of the plant. This representation describes the plant's branching structure, contains parametric information for each plant organ, and can therefore be used directly in a variety of tasks. In this data-driven approach, we train a recursive neural network with virtual plants generated using an L-systems-based procedural model. After training, the network allows to infer a parametric tree-like representation based on an input 3D point cloud. Our method is applicable to any plant that can be represented as binary axial tree. We evaluate our approach on Chenopodium Album plants, using experiments on synthetic plants to show that our unified framework allows for different tasks including reconstruction, segmentation and skeletonization, while achieving results on-par with state-of-the-art for each task.
Problem

Research questions and friction points this paper is trying to address.

Reconstruct 3D plant architecture from unstructured scans
Infer parameterized plant representations for multiple tasks
Handle complex plant structures with self-occlusion issues
Innovation

Methods, ideas, or system contributions that make the work stand out.

Uses recursive neural network for plant representation
Infers parametric tree-like structure from 3D scans
Trains with L-systems-generated virtual plants
๐Ÿ”Ž Similar Papers
No similar papers found.