A Quantitative Evaluation of the Expressivity of BMI, Pose and Gender in Body Embeddings for Recognition and Identification

📅 2025-03-09
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
This work addresses representation bias in person re-identification (ReID) induced by sensitive attributes—specifically body mass index (BMI), pose (pitch/yaw), and gender. We propose *expressivity*, a novel mutual information–based metric that quantifies the degree to which sensitive attributes are encoded in human embeddings—a first systematic characterization of such bias in ReID representations. Methodologically, we employ auxiliary neural networks to estimate mutual information, integrated within the SemReID self-supervised framework, and conduct hierarchical and temporal dynamic analyses to track expressivity evolution across network depth and training epochs. Key findings reveal that BMI exhibits significantly higher expressivity in final-layer attention mechanisms than pose or gender, establishing it as the dominant source of representation bias in ReID. This work provides an interpretable, quantitative theoretical tool and empirical foundation for bias diagnosis and debiasing design in ReID systems.

Technology Category

Application Category

📝 Abstract
Person Re-identification (ReID) systems identify individuals across images or video frames and play a critical role in various real-world applications. However, many ReID methods are influenced by sensitive attributes such as gender, pose, and body mass index (BMI), which vary in uncontrolled environments, leading to biases and reduced generalization. To address this, we extend the concept of expressivity to the body recognition domain to better understand how ReID models encode these attributes. Expressivity, defined as the mutual information between feature vector representations and specific attributes, is computed using a secondary neural network that takes feature and attribute vectors as inputs. This provides a quantitative framework for analyzing the extent to which sensitive attributes are embedded in the model's representations. We apply expressivity analysis to SemReID, a state-of-the-art self-supervised ReID model, and find that BMI consistently exhibits the highest expressivity scores in the model's final layers, underscoring its dominant role in feature encoding. In the final attention layer of the trained network, the expressivity order for body attributes is BMI>Pitch>Yaw>Gender, highlighting their relative importance in learned representations. Additionally, expressivity values evolve progressively across network layers and training epochs, reflecting a dynamic encoding of attributes during feature extraction. These insights emphasize the influence of body-related attributes on ReID models and provide a systematic methodology for identifying and mitigating attribute-driven biases. By leveraging expressivity analysis, we offer valuable tools to enhance the fairness, robustness, and generalization of ReID systems in diverse real-world settings.
Problem

Research questions and friction points this paper is trying to address.

Quantify how BMI, pose, and gender influence body recognition models.
Analyze expressivity of sensitive attributes in ReID feature encoding.
Mitigate biases in ReID systems using expressivity analysis.
Innovation

Methods, ideas, or system contributions that make the work stand out.

Expressivity analysis quantifies attribute encoding in ReID models.
Secondary neural network computes mutual information for attributes.
Dynamic attribute encoding observed across network layers and epochs.