Enhance Learning Efficiency of Oblique Decision Tree via Feature Concatenation

📅 2025-02-01
📈 Citations: 0
Influential: 0
🤖 AI Summary
Existing oblique decision trees (ODTs) suffer from low learning efficiency under shallow architectures and weak generalization because their linear projections are not transferable across decision paths, leading to parameter redundancy. To address this, we propose FC-ODT—a path-level feature-concatenated ODT that, for the first time, enables *transferable* linear projections along decision paths. Its core innovation lies in shared, cumulative, path-dependent feature transformations: each node inherits and refines representations built incrementally along its root-to-node path. We theoretically establish that FC-ODT achieves a faster consistency convergence rate than standard ODTs. Empirically, under strict depth constraints, FC-ODT consistently outperforms state-of-the-art decision tree methods in accuracy, training efficiency, and generalization, striking a favorable trade-off between model compactness and expressive power.

📝 Abstract
Oblique Decision Tree (ODT) separates the feature space by linear projections, as opposed to the conventional Decision Tree (DT) that forces axis-parallel splits. ODT has been proven to have a stronger representation ability than DT, as it provides a way to create shallower tree structures while still approximating complex decision boundaries. However, its learning efficiency is still insufficient, since the linear projections cannot be transmitted to the child nodes, resulting in a waste of model parameters. In this work, we propose an enhanced ODT method with Feature Concatenation (FC-ODT), which enables in-model feature transformation to transmit the projections along the decision paths. Theoretically, we prove that our method enjoys a faster consistency rate w.r.t. the tree depth, indicating that our method possesses a significant advantage in generalization performance, especially for shallow trees. Experiments show that FC-ODT can outperform other state-of-the-art decision trees with a limited tree depth.
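The abstract's core mechanism — concatenating each node's linear projection onto the feature vector so child nodes can reuse it — can be sketched as follows. This is a minimal illustration under assumptions not stated in the listing: ridge regression stands in for the paper's projection learner, and a median split is used for simplicity; the authors' actual FC-ODT training procedure may differ.

```python
import numpy as np
from sklearn.linear_model import Ridge

def build_fc_node(X, y, depth, max_depth=3, min_samples=8):
    """Grow a tree whose oblique splits reuse ancestor projections.

    At each node, X already contains the projection features concatenated
    by all ancestors, so the node's linear split is learned on the
    augmented (path-dependent) representation.
    """
    node = {"value": float(np.mean(y))}
    if depth >= max_depth or len(y) < min_samples or np.allclose(y, y[0]):
        return node
    # Oblique split: a linear projection w^T x + b (ridge is an assumption)
    ridge = Ridge(alpha=1.0).fit(X, y)
    proj = X @ ridge.coef_ + ridge.intercept_
    thr = float(np.median(proj))
    left, right = proj <= thr, proj > thr
    if left.sum() == 0 or right.sum() == 0:
        return node
    # Feature concatenation: children inherit the parent's projection
    # as an extra input column instead of relearning it from scratch.
    X_aug = np.hstack([X, proj[:, None]])
    node.update(
        model=ridge, threshold=thr,
        left=build_fc_node(X_aug[left], y[left], depth + 1, max_depth, min_samples),
        right=build_fc_node(X_aug[right], y[right], depth + 1, max_depth, min_samples),
    )
    return node

def predict_fc(node, x):
    """Route one sample to a leaf, carrying projections down the path."""
    x = np.asarray(x, dtype=float)
    while "model" in node:
        p = float(x @ node["model"].coef_ + node["model"].intercept_)
        x = np.append(x, p)  # concatenate the projection, as in training
        node = node["left"] if p <= node["threshold"] else node["right"]
    return node["value"]
```

The key difference from a standard ODT is the `np.hstack`/`np.append` step: each node's learned projection becomes an input feature for its subtree, which is the "in-model feature transformation" the abstract describes.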
Problem

Research questions and friction points this paper is trying to address.

Oblique Decision Trees
Learning Efficiency
Complex Problem Handling
Innovation

Methods, ideas, or system contributions that make the work stand out.

FC-ODT
Improved Learning Efficiency
Enhanced Model Performance
Shen-Huan Lyu
Hohai University
Artificial Intelligence · Machine Learning · Data Mining
Yi-Xiao He
Nanjing University of Chinese Medicine
Machine Learning · Data Mining
Yanyan Wang
Key Laboratory of Water Big Data Technology of Ministry of Water Resources, College of Computer Science and Software Engineering, Hohai University, Nanjing, China
Zhihao Qu
Key Laboratory of Water Big Data Technology of Ministry of Water Resources, College of Computer Science and Software Engineering, Hohai University, Nanjing, China
Bin Tang
Key Laboratory of Water Big Data Technology of Ministry of Water Resources, College of Computer Science and Software Engineering, Hohai University, Nanjing, China
Baoliu Ye
Associate Professor of Computer Science, Nanjing University, China
Wireless Network · Distributed Computing