Adaptive Forests For Classification

📅 2025-10-27
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
Traditional ensemble methods (e.g., Random Forest, XGBoost) assign uniform weights to all CART trees, ignoring their input-dependent discriminative capabilities—thereby limiting classification performance. This paper proposes Adaptive Forest (AF), an ensemble framework that assigns context-aware, non-uniform weights to individual trees via an input-dependent dynamic weighting mechanism. Its core innovation lies in jointly leveraging Optimal Prediction Policy Trees (OP²T) and Mixed-Integer Optimization (MIO) to achieve both interpretability and global optimality in weight assignment. Evaluated on over 20 real-world datasets across binary and multi-class classification tasks, AF consistently outperforms standard baselines and state-of-the-art weighted ensemble methods. Empirical results demonstrate that input-adaptive weighting significantly enhances generalization performance, validating the efficacy of dynamically calibrated tree contributions in ensemble learning.
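The core idea of input-dependent tree weighting can be illustrated with a minimal sketch. This is not the authors' OP²T/MIO implementation: the hypothetical `weight_policy` below simply weights each tree by its per-input confidence as a stand-in for the prescribed, input-dependent weights.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Train a small ensemble of CART trees on bootstrap samples (RF-style).
X, y = make_classification(n_samples=400, n_features=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
rng = np.random.default_rng(0)
trees = []
for _ in range(10):
    idx = rng.integers(0, len(X_tr), len(X_tr))
    trees.append(DecisionTreeClassifier(max_depth=4, random_state=0).fit(X_tr[idx], y_tr[idx]))

def weight_policy(x):
    # Hypothetical stand-in for OP2T: weight each tree by its confidence
    # on this particular input (uniform RF weighting would return ones).
    probs = np.array([t.predict_proba(x.reshape(1, -1))[0] for t in trees])
    conf = probs.max(axis=1)        # per-tree confidence on x
    return conf / conf.sum()        # normalized, input-dependent weights

def predict_adaptive(x):
    w = weight_policy(x)
    probs = np.array([t.predict_proba(x.reshape(1, -1))[0] for t in trees])
    return np.argmax(w @ probs)     # weighted soft vote across trees

preds = np.array([predict_adaptive(x) for x in X_te])
print("adaptive-weight accuracy:", (preds == y_te).mean())
```

The contrast with RF is only in `weight_policy`: replacing it with uniform weights recovers the standard equally weighted vote that the paper argues is suboptimal.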


📝 Abstract
Random Forests (RF) and Extreme Gradient Boosting (XGBoost) are two of the most widely used and best-performing classification and regression models. They aggregate equally weighted CART trees, generated randomly in RF or sequentially in XGBoost. In this paper, we propose Adaptive Forests (AF), a novel approach that adaptively selects the weights of the underlying CART models. AF combines (a) the Optimal Predictive-Policy Trees (OP2T) framework to prescribe tailored, input-dependent unequal weights to trees and (b) Mixed Integer Optimization (MIO) to refine weight candidates dynamically, enhancing overall performance. We demonstrate that AF consistently outperforms RF, XGBoost, and other weighted RF variants in binary and multi-class classification problems over 20+ real-world datasets.
Problem

Research questions and friction points this paper is trying to address.

Uniform tree weights in RF and XGBoost ignore each tree's input-dependent discriminative ability, limiting classification performance
How to assign non-uniform, input-dependent weights to trees while retaining interpretability and global optimality
Whether adaptive weighting improves generalization over RF and XGBoost across diverse real-world datasets
Innovation

Methods, ideas, or system contributions that make the work stand out.

Adaptively selects weights for CART trees
Uses OP2T framework for input-dependent weights
Employs MIO to refine weights dynamically
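The MIO step selects tree weights to global optimality; a toy stand-in (not the paper's actual MIO formulation) makes the idea concrete by enumerating a small discrete set of candidate weight vectors and keeping the one with the best validation accuracy:

```python
import numpy as np
from itertools import product

def soft_vote(tree_probs, w):
    # tree_probs: (n_trees, n_samples, n_classes); w: (n_trees,)
    return np.einsum("t,tsc->sc", w, tree_probs).argmax(axis=1)

def pick_best_weights(tree_probs, y_val, levels=(0.0, 0.5, 1.0)):
    # Toy stand-in for MIO refinement: exhaustively search discrete
    # weight candidates per tree, keep the best-validating vector.
    n_trees = tree_probs.shape[0]
    best_w, best_acc = None, -1.0
    for cand in product(levels, repeat=n_trees):
        if sum(cand) == 0:
            continue
        w = np.array(cand) / sum(cand)
        acc = (soft_vote(tree_probs, w) == y_val).mean()
        if acc > best_acc:
            best_w, best_acc = w, acc
    return best_w, best_acc

# Demo: a perfect tree and an always-wrong tree on four samples.
y_val = np.array([0, 1, 0, 1])
probs_good = np.eye(2)[y_val]       # always correct
probs_bad = np.eye(2)[1 - y_val]    # always wrong
w, acc = pick_best_weights(np.stack([probs_good, probs_bad]), y_val)
print("chosen weights:", w, "val accuracy:", acc)
```

Exhaustive enumeration is exponential in the number of trees; the point of the paper's MIO formulation is to solve this weight-selection problem to optimality at realistic scale.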
Dimitris Bertsimas
Boeing Professor of Operations Research, MIT
Operations Research · Optimization · Stochastics · Analytics · Health Care
Yubing Cui
Operations Research Center, Massachusetts Institute of Technology, Cambridge, MA 02139, USA