🤖 AI Summary
This paper studies lossy source coding in class-imbalanced settings by adopting focal loss as the distortion measure. It derives single-shot converse and achievability bounds and proves that, in the infinite-blocklength limit, the resulting rate-distortion function coincides with that of classical log loss, establishing asymptotic equivalence between the two. Non-asymptotic numerical simulations then quantify the gap between the losses at finite blocklengths. Key contributions: (1) the first use of focal loss as a distortion measure in source coding; (2) single-shot converse and achievability bounds that yield a rigorous rate-distortion theory for focal-loss coding; and (3) a characterization of its performance limits, both asymptotic and non-asymptotic.
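For context, a plausible form of the distortion measure is the standard focal loss of Lin et al. (2017), written per symbol against a reproduction that is a probability distribution, as in the log-loss setting; this exact parameterization is an assumption, and the paper may define it differently:

```latex
% Focal-loss distortion between a source symbol x and a reproduction
% distribution \hat{P} over the source alphabet (assumed form):
\[
  d_{\mathrm{FL}}\bigl(x, \hat{P}\bigr)
    = -\bigl(1 - \hat{P}(x)\bigr)^{\gamma} \log \hat{P}(x),
  \qquad \gamma \ge 0 .
\]
% Setting \gamma = 0 recovers the log-loss distortion
% d_{\log}(x, \hat{P}) = -\log \hat{P}(x); larger \gamma down-weights
% symbols the reproduction already predicts well.
```

Setting the focusing parameter to zero recovers log loss, which is consistent with the asymptotic equivalence stated above.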
📝 Abstract
Focal loss has recently gained significant popularity, particularly in tasks such as object detection, where it helps address class imbalance by focusing on hard-to-classify examples. This work proposes focal loss as a distortion measure for lossy source coding. The paper provides single-shot converse and achievability bounds. These bounds are then used to characterize the distortion-rate trade-off in the infinite-blocklength limit, which is shown to be the same as that for the log-loss case. In the non-asymptotic regime, the difference between focal loss and log loss is illustrated through a series of simulations.
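As a minimal illustration of the pointwise contrast between the two losses (not the paper's simulations), the sketch below compares the per-symbol distortions over a range of reproduction probabilities; the focusing parameter gamma = 2 is a common default and an assumption here:

```python
import numpy as np

def log_loss(p):
    """Log-loss distortion: -log p, where p is the probability the
    reproduction distribution assigns to the true source symbol."""
    return -np.log(p)

def focal_loss(p, gamma=2.0):
    """Focal-loss distortion: -(1 - p)^gamma * log p.
    Reduces to log loss when gamma = 0."""
    return -((1.0 - p) ** gamma) * np.log(p)

# Compare the two distortions across reproduction probabilities.
for p in np.linspace(0.05, 0.95, 10):
    print(f"p={p:.2f}  log={log_loss(p):.4f}  focal(gamma=2)={focal_loss(p):.4f}")
```

The modulating factor (1 - p)^gamma shrinks the distortion for well-predicted symbols (p near 1) much faster than log loss does, which is the mechanism by which focal loss emphasizes hard examples under class imbalance.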