🤖 AI Summary
Conventional metrics such as floating-point operations per second (FLOPS) fail to capture the intrinsic performance characteristics of emerging computing paradigms—including low-precision, analog, quantum, and reversible logic—because they bake in hardware- and precision-specific assumptions.
Method: This paper proposes a general, information-theoretic framework for computational performance evaluation, modeling computation as an information-transformation channel from input to output and using mutual information as the core metric to quantify a system’s capacity to encode, process, and preserve semantically meaningful information.
Contribution/Results: It is the first work to systematically integrate Shannon’s mutual information into computational performance assessment, thereby decoupling evaluation from underlying hardware implementations and numerical representations. The framework enables paradigm-agnostic, implementation-independent performance analysis across heterogeneous computing models. It establishes a foundational theoretical basis and provides a scalable, principled metric for rigorously assessing both the effectiveness and efficiency of next-generation heterogeneous computing systems.
📝 Abstract
This work proposes a novel computing performance unit grounded in information theory. Modern computing systems are increasingly diverse, supporting low-precision formats, hardware specialization, and emerging paradigms such as analog, quantum, and reversible logic. Traditional metrics like floating-point operation counts (FLOPs) no longer accurately capture this complexity. We frame computing as the transformation of information through a channel and define performance in terms of the mutual information between a system's inputs and outputs. This approach measures not just the quantity of data processed, but the amount of meaningful information encoded, manipulated, and retained through computation. Our framework provides a principled, implementation-agnostic foundation for evaluating performance.
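The channel view above can be made concrete with a small sketch (not code from the paper): treat a computation as a discrete channel from input X to output Y, and score it by the Shannon mutual information I(X; Y). The joint distributions below are illustrative assumptions, contrasting a noiseless 1-bit computation with one whose output bit flips with probability 0.1.

```python
# Hypothetical illustration of the paper's metric: mutual information
# I(X; Y) = sum_{x,y} p(x,y) * log2( p(x,y) / (p(x) * p(y)) )
# for a computation modeled as a discrete input->output channel.
import math

def mutual_information(joint):
    """joint[x][y] = p(x, y); returns I(X; Y) in bits."""
    px = [sum(row) for row in joint]          # marginal over inputs
    py = [sum(col) for col in zip(*joint)]    # marginal over outputs
    mi = 0.0
    for x, row in enumerate(joint):
        for y, pxy in enumerate(row):
            if pxy > 0:  # 0 * log(0) contributes nothing
                mi += pxy * math.log2(pxy / (px[x] * py[y]))
    return mi

# Noiseless 1-bit computation: output always equals input.
noiseless = [[0.5, 0.0],
             [0.0, 0.5]]
# Same computation with the output bit flipped 10% of the time
# (a binary symmetric channel): information is partially lost.
noisy = [[0.45, 0.05],
         [0.05, 0.45]]

print(mutual_information(noiseless))  # 1.0 bit preserved
print(mutual_information(noisy))      # about 0.531 bits preserved
```

The noisy case illustrates why the metric is implementation-agnostic: it charges the system only for the meaningful information that survives the computation, regardless of how many raw operations were executed.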