Back to Bits: Extending Shannon's communication performance framework to computing

📅 2025-08-07
📈 Citations: 0
Influential: 0
🤖 AI Summary
Conventional metrics such as floating-point operations per second (FLOPS) fail to capture the intrinsic performance characteristics of emerging computing paradigms—including low-precision, analog, quantum, and reversible logic—because they bake in hardware- and precision-specific assumptions. Method: This paper proposes a general, information-theoretic framework for computational performance evaluation, modeling computation as an information-transformation channel from input to output and using mutual information as the core metric to quantify a system's capacity to encode, process, and preserve semantically meaningful information. Contribution/Results: It is the first work to systematically integrate Shannon's mutual information into computational performance assessment, thereby decoupling evaluation from underlying hardware implementations and numerical representations. The framework enables paradigm-agnostic, implementation-independent performance analysis across heterogeneous computing models, establishing a theoretical foundation and a principled, scalable metric for assessing both the effectiveness and efficiency of next-generation heterogeneous computing systems.
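The core metric named above is Shannon's standard mutual information (the paper's own notation may differ); treating a computation as a channel from input $X$ to output $Y$:

```latex
I(X;Y) \;=\; \sum_{x,\,y} p(x,y)\,\log_2\!\frac{p(x,y)}{p(x)\,p(y)} \;=\; H(X) - H(X \mid Y)
```

Intuitively, $H(X \mid Y)$ is the input information the computation destroys, so $I(X;Y)$ counts only the bits that survive the transformation.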

📝 Abstract
This work proposes a novel computing performance unit grounded in information theory. Modern computing systems are increasingly diverse, supporting low-precision formats, hardware specialization, and emerging paradigms such as analog, quantum, and reversible logic. Traditional metrics like floating-point operations (flops) no longer accurately capture this complexity. We frame computing as the transformation of information through a channel and define performance in terms of the mutual information between a system's inputs and outputs. This approach measures not just the quantity of data processed, but the amount of meaningful information encoded, manipulated, and retained through computation. Our framework provides a principled, implementation-agnostic foundation for evaluating performance.
Problem

Research questions and friction points this paper is trying to address.

Extend Shannon's framework to computing performance
Address limitations of traditional metrics like flops
Measure meaningful information in diverse computing systems
Innovation

Methods, ideas, or system contributions that make the work stand out.

Novel computing performance unit from information theory
Measures mutual information between inputs and outputs
Implementation-agnostic framework for diverse computing systems
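To make the proposed metric concrete, here is a minimal sketch (not the paper's implementation) that estimates empirical mutual information from input/output samples of a toy lossy computation, modeling the information discarded by low-precision arithmetic:

```python
import math
from collections import Counter

def mutual_information(pairs):
    """Empirical mutual information I(X;Y) in bits from (x, y) samples."""
    n = len(pairs)
    joint = Counter(pairs)                 # joint counts over (x, y)
    px = Counter(x for x, _ in pairs)      # marginal counts over x
    py = Counter(y for _, y in pairs)      # marginal counts over y
    mi = 0.0
    for (x, y), c in joint.items():
        # p(x,y) / (p(x) p(y)) expressed with counts: c * n / (cx * cy)
        mi += (c / n) * math.log2(c * n / (px[x] * py[y]))
    return mi

# Toy "computation channel": 8 uniform input symbols (3 bits of entropy)
# mapped through a 2-to-1 quantizer, as in low-precision rounding.
inputs = list(range(8)) * 100
outputs = [x // 2 for x in inputs]
print(mutual_information(list(zip(inputs, outputs))))  # 2.0 bits preserved
```

An identity channel over the same inputs would score the full 3.0 bits; the quantizer's score of 2.0 bits quantifies exactly how much meaningful information the lossy computation retains, independent of how it is implemented in hardware.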
🔎 Similar Papers
2024-06-07 · International Symposium on High-Performance Computer Architecture · Citations: 5
Max Hawkins
Georgia Institute of Technology, Atlanta, Georgia, USA
Richard Vuduc
Georgia Institute of Technology
high performance computing · parallel computing