Enhance and Reuse: A Dual-Mechanism Approach to Boost Deep Forest for Label Distribution Learning

📅 2026-02-06
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the insufficient exploitation of inter-label correlations in label distribution learning by proposing a deep forest framework that explicitly incorporates label dependencies. The approach introduces a dual mechanism: first, it enhances original features through label correlation modeling to explicitly capture label associations; second, it employs a measure-aware feature reuse strategy that dynamically reuses effective features for samples exhibiting performance degradation on the validation set, thereby mitigating noise propagation and stabilizing training. Extensive experiments across multiple benchmark datasets demonstrate that the proposed method consistently outperforms existing state-of-the-art algorithms across all six evaluation metrics, significantly improving the accuracy and robustness of label distribution prediction.

📝 Abstract
Label distribution learning (LDL) requires the learner to predict the degree of correlation between each sample and each label. To achieve this, a crucial task during learning is to leverage the correlation among labels. Deep Forest (DF) is a deep learning framework based on tree ensembles whose training phase does not rely on backpropagation. DF performs in-model feature transformation using the prediction of each layer and achieves competitive performance on many tasks. However, its exploration in the field of LDL is still in its infancy, and the few existing methods that apply DF to LDL lack effective ways to exploit the correlation among labels. Therefore, we propose a method named Enhanced and Reused Feature Deep Forest (ERDF). It comprises two mechanisms: feature enhancement exploiting label correlation, and measure-aware feature reuse. The first uses the correlation among labels to enhance the original features, so that samples acquire more comprehensive information for the LDL task. The second reuses the previous layer's features for samples that perform worse than at the previous layer on the validation set, ensuring the stability of the training process. This Enhance-Reuse pattern not only enriches the samples' features but also validates the effectiveness of the new features, falling back to reuse to prevent noise from spreading further. Experiments show that our method outperforms comparison algorithms on six evaluation metrics.
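The Enhance-Reuse pattern described in the abstract can be sketched roughly as follows. This is a minimal, hypothetical NumPy illustration, not the authors' implementation: a ridge-regression predictor stands in for each forest layer, a cosine similarity matrix `C` over label columns stands in for the paper's label-correlation modeling, and per-sample training error stands in for the validation-set measure.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: n samples, d features, q labels; each row of Y is a label distribution.
n, d, q = 60, 8, 4
X = rng.normal(size=(n, d))
Y = rng.dirichlet(np.ones(q), size=n)

def layer_predict(feats, Y, ridge=1e-2):
    """Stand-in for one forest layer: ridge regression + row normalisation."""
    W = np.linalg.solve(feats.T @ feats + ridge * np.eye(feats.shape[1]), feats.T @ Y)
    P = np.clip(feats @ W, 1e-6, None)
    return P / P.sum(axis=1, keepdims=True)

# Label-correlation matrix: cosine similarity between label columns of Y.
norms = np.linalg.norm(Y, axis=0)
C = (Y.T @ Y) / np.outer(norms, norms)

def enhance(X, P, C):
    """Feature enhancement: append predictions propagated through label correlations."""
    return np.hstack([X, P @ C])

feats, prev_err, prev_enh = X, np.full(n, np.inf), None
for layer in range(3):
    P = layer_predict(feats, Y)
    err = np.abs(P - Y).sum(axis=1)      # per-sample distance to the target distribution
    enh = enhance(X, P, C)
    if prev_enh is not None:
        worse = err > prev_err           # samples that degraded at this layer
        enh[worse] = prev_enh[worse]     # measure-aware feature reuse for those samples
        err = np.minimum(err, prev_err)
    feats, prev_err, prev_enh = enh, err, enh
```

In ERDF itself, tree-ensemble layers, the paper's correlation modeling, and a held-out validation measure would replace these stand-ins; the sketch only shows how enhancement and per-sample reuse interleave across cascade layers.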
Problem

Research questions and friction points this paper is trying to address.

Label Distribution Learning
Deep Forest
Label Correlation
Feature Enhancement
Model Stability
Innovation

Methods, ideas, or system contributions that make the work stand out.

Label Distribution Learning
Deep Forest
Label Correlation
Feature Enhancement
Feature Reuse
Jia-Le Xu
Key Laboratory of Water Big Data Technology of Ministry of Water Resources, Hohai University, Nanjing 211100, China; College of Computer Science and Software Engineering, Hohai University, Nanjing 211100, China
Shen-Huan Lyu
Hohai University
Artificial Intelligence; Machine Learning; Data Mining
Yu-Nian Wang
Key Laboratory of Water Big Data Technology of Ministry of Water Resources, Hohai University, Nanjing 211100, China; College of Computer Science and Software Engineering, Hohai University, Nanjing 211100, China
Ning Chen
Key Laboratory of Water Big Data Technology of Ministry of Water Resources, Hohai University, Nanjing 211100, China; College of Computer Science and Software Engineering, Hohai University, Nanjing 211100, China
Zhihao Qu
Key Laboratory of Water Big Data Technology of Ministry of Water Resources, Hohai University, Nanjing 211100, China; College of Computer Science and Software Engineering, Hohai University, Nanjing 211100, China
Bin Tang
Professor, Hohai University
Edge Computing; Network Coding
Baoliu Ye
Associate Professor of Computer Science, Nanjing University, China
Wireless Network; Distributed Computing