Quantum Fisher information matrices from Rényi relative entropies

📅 2025-10-02
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the non-uniqueness of quantum extensions of the Fisher information matrix (QFIM) by systematically deriving QFIMs from three distinct Rényi relative entropies: the log-Euclidean, α-z, and geometric Rényi relative entropies. Using the method of divided differences for calculating matrix derivatives, it uncovers the underlying information-geometric structures: the log-Euclidean Rényi relative entropy yields the Kubo–Mori information matrix, while the geometric Rényi relative entropy yields the right-logarithmic derivative (RLD) Fisher information matrix. Both resulting matrices satisfy the data-processing inequality for all α ≥ 0, even though the original Rényi quantities do not. The paper further derives analytical expressions for the α-z information matrices of parameterized thermal states and designs hybrid quantum-classical estimation algorithms compatible with noisy intermediate-scale quantum (NISQ) devices. This framework provides new theoretical tools and practical methodologies for quantum metrology, thermodynamic characterization of quantum systems, and learning in quantum Boltzmann machines.
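The α-z family mentioned above has the closed form D_{α,z}(ρ‖σ) = (1/(α−1)) log Tr[(σ^{(1−α)/(2z)} ρ^{α/z} σ^{(1−α)/(2z)})^z], which can be evaluated directly for full-rank states. A minimal numerical sketch (the function name is illustrative, not from the paper):

```python
import numpy as np
from scipy.linalg import fractional_matrix_power

def alpha_z_renyi(rho, sigma, alpha, z):
    """alpha-z Renyi relative entropy for full-rank density matrices:
    D_{alpha,z}(rho||sigma)
      = 1/(alpha-1) * log Tr[(sigma^((1-alpha)/(2z)) rho^(alpha/z)
                              sigma^((1-alpha)/(2z)))^z].
    """
    s = fractional_matrix_power(sigma, (1 - alpha) / (2 * z))
    r = fractional_matrix_power(rho, alpha / z)
    inner = s @ r @ s
    val = np.trace(fractional_matrix_power(inner, z)).real
    return np.log(val) / (alpha - 1)
```

For commuting (diagonal) states the value reduces to the classical Rényi divergence and is independent of z, which gives a quick sanity check.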

📝 Abstract
Quantum generalizations of the Fisher information are important in quantum information science, with applications in high energy and condensed matter physics and in quantum estimation theory, machine learning, and optimization. One can derive a quantum generalization of the Fisher information matrix in a natural way as the Hessian matrix arising in a Taylor expansion of a smooth divergence. Such an approach is appealing for quantum information theorists, given the ubiquity of divergences in quantum information theory. In contrast to the classical case, there is not a unique quantum generalization of the Fisher information matrix, similar to how there is not a unique quantum generalization of the relative entropy or the Rényi relative entropy. In this paper, I derive information matrices arising from the log-Euclidean, $α$-$z$, and geometric Rényi relative entropies, with the main technical tool for doing so being the method of divided differences for calculating matrix derivatives. Interestingly, for all non-negative values of the Rényi parameter $α$, the log-Euclidean Rényi relative entropy leads to the Kubo-Mori information matrix, and the geometric Rényi relative entropy leads to the right-logarithmic derivative Fisher information matrix. Thus, the resulting information matrices obey the data-processing inequality for all non-negative values of the Rényi parameter $α$ even though the original quantities do not. Additionally, I derive and establish basic properties of $α$-$z$ information matrices resulting from the $α$-$z$ Rényi relative entropies. For parameterized thermal states, I establish formulas for their $α$-$z$ information matrices and hybrid quantum-classical algorithms for estimating them, with applications in quantum Boltzmann machine learning.
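The abstract's main technical tool, the method of divided differences, computes derivatives of matrix functions via the Daleckii–Krein formula: for Hermitian A(θ) with eigendecomposition A = U diag(λ) U†, the directional derivative of f(A) along A′ has entries f^{[1]}(λ_i, λ_j)·(U†A′U)_{ij} in the eigenbasis, where f^{[1]}(x, y) = (f(x) − f(y))/(x − y) and f^{[1]}(x, x) = f′(x). A hedged sketch of this idea (function names are illustrative); with f = log it yields the derivative of log ρ(θ) that enters the Kubo–Mori information matrix:

```python
import numpy as np

def matrix_func_derivative(A, dA, f, fprime):
    """Directional derivative of f(A) along dA for Hermitian A,
    via first divided differences (Daleckii-Krein theorem)."""
    lam, U = np.linalg.eigh(A)
    dA_eig = U.conj().T @ dA @ U          # perturbation in A's eigenbasis
    n = len(lam)
    F = np.empty((n, n))
    for i in range(n):
        for j in range(n):
            if np.isclose(lam[i], lam[j]):
                F[i, j] = fprime(lam[i])  # diagonal: ordinary derivative
            else:
                F[i, j] = (f(lam[i]) - f(lam[j])) / (lam[i] - lam[j])
    # Hadamard product with divided-difference kernel, rotated back
    return U @ (F * dA_eig) @ U.conj().T
```

For a diagonal A and an off-diagonal perturbation, the off-diagonal entries of the result are exactly the divided differences of f at the corresponding eigenvalue pairs, which makes the formula easy to verify by hand.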
Problem

Research questions and friction points this paper is trying to address.

Deriving quantum Fisher information matrices from Rényi relative entropies
Establishing properties of α-z information matrices for quantum states
Developing quantum algorithms for estimating information matrices in machine learning
Innovation

Methods, ideas, or system contributions that make the work stand out.

Derives quantum Fisher information matrices from Rényi relative entropies
Uses divided differences method for calculating matrix derivatives
Establishes formulas for α-z information matrices in thermal states
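The abstract frames each information matrix as the Hessian of a smooth divergence in a Taylor expansion around the true parameter. A minimal numerical sketch of that idea, using the Umegaki relative entropy (whose Hessian gives the Kubo–Mori matrix) and an illustrative single-qubit thermal family; all function names here are hypothetical, not the paper's algorithms:

```python
import numpy as np
from scipy.linalg import expm, logm

# Pauli matrices for an illustrative single-qubit thermal family
Z = np.diag([1.0, -1.0])
X = np.array([[0.0, 1.0], [1.0, 0.0]])

def thermal_state(theta):
    """rho(theta) = exp(theta_1 Z + theta_2 X) / Tr[exp(...)]"""
    M = expm(theta[0] * Z + theta[1] * X)
    return M / np.trace(M).real

def rel_entropy(rho, sigma):
    """Umegaki relative entropy D(rho||sigma) = Tr[rho (log rho - log sigma)]."""
    return np.trace(rho @ (logm(rho) - logm(sigma))).real

def divergence_hessian(theta0, h=1e-4):
    """Central-difference Hessian of theta -> D(rho(theta0) || rho(theta))
    at theta = theta0; this Hessian is the Kubo-Mori information matrix."""
    rho0 = thermal_state(theta0)
    f = lambda th: rel_entropy(rho0, thermal_state(th))
    n = len(theta0)
    H = np.empty((n, n))
    for i in range(n):
        for j in range(n):
            ei = np.eye(n)[i] * h
            ej = np.eye(n)[j] * h
            H[i, j] = (f(theta0 + ei + ej) - f(theta0 + ei - ej)
                       - f(theta0 - ei + ej) + f(theta0 - ei - ej)) / (4 * h * h)
    return H
```

At θ = 0 the state is maximally mixed and the divergence reduces to log cosh|θ| up to constants, so the Hessian there is the identity, a convenient sanity check for the finite-difference step size.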