🤖 AI Summary
To address the slow, inherently sequential inference of traditional linear-chain CRFs in sequence labeling, this paper proposes the Bregman Conditional Random Field (BCRF). BCRF is the first structured prediction model to perform inference via iterative Bregman projections, making inference over the graphical model fully parallelizable. Combined with Fenchel–Young losses, it supports end-to-end differentiable training as well as learning from partial labels. Empirically, BCRF achieves a several-fold inference speedup over standard CRFs at comparable accuracy, and outperforms parallel baselines such as mean-field approximation in highly constrained settings. The core contributions are: (i) a parallelizable structured inference framework grounded in Bregman divergences; and (ii) a differentiable projection-based iteration scheme optimized jointly with structured output prediction. This approach bridges efficient parallel computation and principled probabilistic modeling for structured prediction.
📝 Abstract
We propose a novel discriminative model for sequence labeling called Bregman conditional random fields (BCRF). In contrast to standard linear-chain conditional random fields, BCRF admits fast, parallelizable inference algorithms based on iterative Bregman projections. We show how such models can be learned using Fenchel-Young losses, including an extension for learning from partial labels. Experimentally, our approach delivers results comparable to CRFs while being faster, and achieves better results in highly constrained settings compared to mean field, another parallelizable alternative.
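The paper's details are not reproduced here, but the canonical instance of the iterative Bregman projections that BCRF builds on is Sinkhorn's algorithm: alternating KL (Bregman) projections of a positive matrix onto row-sum and column-sum constraint sets. The sketch below illustrates that scheme only, not the paper's actual BCRF inference; the function name and parameters are illustrative. Note that each update is an elementwise operation over all rows (or columns) at once, which is the source of the parallelizability the abstract refers to.

```python
import numpy as np

def sinkhorn_bregman(K, r, c, n_iters=500):
    """Alternating KL (Bregman) projections onto marginal constraints.

    K: positive matrix (e.g. exp of potentials/scores).
    r: target row sums, c: target column sums (with r.sum() == c.sum()).
    Returns P, the KL projection of K onto the intersection of
    {P : P.sum(1) = r} and {P : P.sum(0) = c} (Sinkhorn's algorithm).
    """
    u = np.ones_like(r)
    v = np.ones_like(c)
    for _ in range(n_iters):
        u = r / (K @ v)      # Bregman projection onto the row-sum constraint
        v = c / (K.T @ u)    # Bregman projection onto the column-sum constraint
    # Recover the projected matrix from the dual scaling vectors.
    return u[:, None] * K * v[None, :]
```

Each half-step is a closed-form KL projection (a diagonal rescaling), so the whole iteration is a few dense matrix-vector products per sweep, with no sequential dependence along the chain as in forward-backward.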