EDMB: Edge Detector with Mamba

📅 2025-01-08
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
This paper addresses two key challenges in edge detection: the difficulty of jointly modeling long-range dependencies and capturing fine-grained edges, and the inefficiency of learning multi-granularity edge representations under single-label supervision. To this end, we propose EDMB, the first edge detector incorporating the state-space model Mamba into edge detection, enabling efficient global–local feature fusion. We further design a fine-grained perception module and a learnable Gaussian distribution decoder to explicitly model multi-granularity edge responses. Additionally, we introduce an evidence lower bound (ELBO)-based loss function to enable effective supervision of multi-granularity predictions using only single-label ground truth. On BSDS500, EDMB achieves 0.837 ODS for single-granularity and 0.851 ODS for multi-granularity evaluation—without multi-scale testing or external data. Cross-domain generalization is validated on NYUDv2 and BIPED.

📝 Abstract
Transformer-based models have made significant progress in edge detection, but their high computational cost is prohibitive. Recently, vision Mamba models have shown an excellent ability to capture long-range dependencies efficiently. Drawing inspiration from this, we propose a novel edge detector with Mamba, termed EDMB, to efficiently generate high-quality multi-granularity edges. In EDMB, Mamba is combined with a global-local architecture, allowing it to focus on both global information and fine-grained cues. Fine-grained cues play a crucial role in edge detection but are usually ignored by ordinary Mamba. We design a novel decoder that constructs learnable Gaussian distributions by fusing global and fine-grained features; multi-granularity edges are then generated by sampling from these distributions. To make multi-granularity edges applicable to single-label data, we introduce an Evidence Lower Bound (ELBO) loss to supervise the learning of the distributions. On the multi-label dataset BSDS500, EDMB achieves a competitive single-granularity ODS of 0.837 and a multi-granularity ODS of 0.851 without multi-scale testing or extra PASCAL-VOC data. Remarkably, EDMB also extends to single-label datasets such as NYUDv2 and BIPED. The source code is available at https://github.com/Li-yachuan/EDMB.
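The decoder described in the abstract — sampling multi-granularity edges from learnable Gaussian distributions under ELBO supervision — can be illustrated with a minimal NumPy sketch. The function names, tensor shapes, and the exact loss form (binary cross-entropy reconstruction against the single label plus a KL term toward a standard normal prior) are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_edges(mu, log_var, n_samples=4):
    """Reparameterized sampling: each draw is one edge map of a
    different granularity, squashed to [0, 1] with a sigmoid."""
    std = np.exp(0.5 * log_var)
    eps = rng.standard_normal((n_samples,) + mu.shape)
    return 1.0 / (1.0 + np.exp(-(mu + std * eps)))

def elbo_loss(mu, log_var, edge_maps, label):
    """Negative ELBO (assumed form): BCE reconstruction of the sampled
    edge maps against the single-label ground truth, plus the closed-form
    KL divergence between N(mu, sigma^2) and a standard normal prior."""
    e = 1e-7
    bce = -np.mean(label * np.log(edge_maps + e)
                   + (1 - label) * np.log(1 - edge_maps + e))
    kl = 0.5 * np.mean(np.exp(log_var) + mu**2 - 1.0 - log_var)
    return bce + kl
```

Under this sketch, a single label still supervises every sampled granularity through the shared reconstruction term, while the KL term keeps the learned distribution well-behaved — one way to read the paper's claim that ELBO supervision makes multi-granularity prediction work on single-label data.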
Problem

Research questions and friction points this paper is trying to address.

Efficient Model Design
Edge and Line Detection
Image Dataset Diversity
Innovation

Methods, ideas, or system contributions that make the work stand out.

EDMB
Integrated Feature Learning
Guided Learning Method