NexusFlow: Unifying Disparate Tasks under Partial Supervision via Invertible Flow Networks

📅 2025-12-05
📈 Citations: 0
Influential: 0
🤖 AI Summary
To address the challenge of jointly training structurally heterogeneous tasks (e.g., classification, detection, and segmentation) under partial supervision, this paper proposes the first unified partially supervised multi-task learning (PS-MTL) framework tailored to structurally diverse tasks. Methodologically, it introduces lightweight surrogate networks built on invertible normalizing flows: their bijective coupling layers project task-specific features into a shared canonical space, aligning latent feature distributions across heterogeneous tasks while preserving information, preventing representation collapse, and enabling plug-and-play cross-task knowledge transfer. Evaluated on nuScenes, the method surpasses strong partially supervised baselines and sets a new state of the art; on the NYUv2 multi-task benchmark, it consistently improves performance across all constituent tasks, demonstrating both broad applicability and effectiveness.

📝 Abstract
Partially Supervised Multi-Task Learning (PS-MTL) aims to leverage knowledge across tasks when annotations are incomplete. Existing approaches, however, have largely focused on the simpler setting of homogeneous, dense prediction tasks, leaving the more realistic challenge of learning from structurally diverse tasks unexplored. To this end, we introduce NexusFlow, a novel, lightweight, and plug-and-play framework effective in both settings. NexusFlow introduces a set of surrogate networks with invertible coupling layers to align the latent feature distributions of tasks, creating a unified representation that enables effective knowledge transfer. The coupling layers are bijective, preserving information while mapping features into a shared canonical space. This invertibility avoids representational collapse and enables alignment across structurally different tasks without reducing expressive capacity. We first evaluate NexusFlow on the core challenge of domain-partitioned autonomous driving, where dense map reconstruction and sparse multi-object tracking are supervised in different geographic regions, creating both structural disparity and a strong domain gap. NexusFlow sets a new state-of-the-art result on nuScenes, outperforming strong partially supervised baselines. To demonstrate generality, we further test NexusFlow on NYUv2 using three homogeneous dense prediction tasks (segmentation, depth, and surface normals) as a representative N-task PS-MTL scenario. NexusFlow yields consistent gains across all tasks, confirming its broad applicability.
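The abstract's claim that invertible coupling layers preserve information while mapping features into a shared space can be made concrete with a minimal sketch. The paper's exact architecture is not given here; the code below uses a standard RealNVP-style affine coupling layer (half the features pass through unchanged, the other half is affinely transformed conditioned on the first half), with toy conditioner networks as assumptions, to show why such layers are exactly invertible.

```python
import numpy as np

def coupling_forward(x, scale_net, shift_net):
    # Split features into two halves; transform x2 conditioned only on x1.
    x1, x2 = np.split(x, 2, axis=-1)
    s = scale_net(x1)           # log-scale, a function of the untouched half
    t = shift_net(x1)           # shift, a function of the untouched half
    y2 = x2 * np.exp(s) + t     # elementwise affine transform of x2
    return np.concatenate([x1, y2], axis=-1)

def coupling_inverse(y, scale_net, shift_net):
    # Exact inverse: y1 == x1, so s and t can be recomputed and undone.
    y1, y2 = np.split(y, 2, axis=-1)
    s = scale_net(y1)
    t = shift_net(y1)
    x2 = (y2 - t) * np.exp(-s)
    return np.concatenate([y1, x2], axis=-1)

# Toy conditioner networks (assumptions; any function of x1 works).
rng = np.random.default_rng(0)
W_s, W_t = rng.normal(size=(4, 4)), rng.normal(size=(4, 4))
scale_net = lambda h: np.tanh(h @ W_s)   # bounded log-scales for stability
shift_net = lambda h: h @ W_t

x = rng.normal(size=(2, 8))              # a batch of task features
y = coupling_forward(x, scale_net, shift_net)
x_rec = coupling_inverse(y, scale_net, shift_net)
assert np.allclose(x, x_rec)             # bijective: no information lost
```

Because the inverse reconstructs the input exactly, alignment performed on the transformed features cannot collapse the representation: every input remains recoverable, which is the property the abstract attributes to the coupling layers.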
Problem

Research questions and friction points this paper is trying to address.

Addresses learning from structurally diverse tasks with incomplete annotations.
Aligns latent feature distributions across tasks for effective knowledge transfer.
Handles both homogeneous dense prediction and structurally disparate task scenarios.
Innovation

Methods, ideas, or system contributions that make the work stand out.

Invertible flow networks align latent feature distributions.
Bijective coupling layers preserve information in a shared space.
Lightweight plug-and-play framework for structurally diverse tasks.