🤖 AI Summary
In high-density qubit systems, strong inter-qubit interactions induce complex many-body dynamics: the sensor's expectation-value response oscillates through multiple peaks even at small field amplitudes, so the signal becomes multi-valued and the dynamic range is severely limited.
Method: We propose a dynamic-range extension framework based on parameterized quantum circuit learning. The expectation-value response is modeled by a trainable quantum gate sequence, and the gate parameters are jointly optimized with gradient-based methods so that the measurement signal is monotonic over the target range of field amplitudes, removing the oscillatory ambiguity.
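In our notation (not the paper's), the training objective can be stated compactly: with $|\psi(B)\rangle$ the state after sensing a field of amplitude $B$, $U(\theta)$ the trainable gate sequence, and $M$ the measured observable,

$$
f_\theta(B) = \langle \psi(B) |\, U(\theta)^\dagger M\, U(\theta)\, | \psi(B) \rangle,
\qquad
\theta^\star = \arg\min_\theta \sum_i \max\!\bigl(0,\; f_\theta(B_i) - f_\theta(B_{i+1})\bigr)^2,
$$

where $B_1 < B_2 < \cdots$ grid the target field range. The penalty vanishes exactly when the response is non-decreasing across the grid; this is one plausible monotonicity loss, and the paper's actual objective may differ.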
Contribution/Results: This work introduces quantum circuit learning as a tool for dynamic-range extension in quantum metrology. While preserving spatial resolution, it enables wide-range (>10× improvement), single-valued, high-fidelity magnetic-field estimation, pointing toward practical high-density solid-state quantum sensors.
📝 Abstract
Quantum metrology is a promising application of quantum technologies, enabling the precise measurement of weak external fields at a local scale. In typical quantum sensing protocols, a qubit interacts with an external field, and the amplitude of the field is estimated by analyzing the expectation value of a measured observable. Sensitivity can, in principle, be enhanced by increasing the number of qubits within a fixed volume, thereby maintaining spatial resolution. However, at high qubit densities, inter-qubit interactions induce complex many-body dynamics, resulting in multiple oscillations in the expectation value of the observable even for small field amplitudes. This ambiguity reduces the dynamic range of the sensing protocol. We propose a method to overcome this limitation by adopting a quantum circuit learning framework, in which a parameterized quantum circuit is trained to approximate a target function by optimizing its parameters. In our method, after the qubits interact with the external field, we apply a sequence of parameterized quantum gates and measure a suitable observable. By optimizing the gate parameters, the expectation value is trained to exhibit a monotonic response within a target range of field amplitudes, thereby eliminating multiple oscillations and enhancing the dynamic range. This method offers a strategy for improving quantum sensing performance in dense qubit systems.
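To make the protocol concrete, here is a minimal numerical sketch (Python with NumPy/SciPy), not the authors' implementation. Two interacting sensor qubits evolve under an assumed Hamiltonian $B(Z_1 + Z_2) + J\,Z_1 Z_2$, a trainable layer of single-qubit $R_y$ rotations is applied after sensing, and the grid-based monotonicity penalty above is minimized by finite-difference gradient descent. The ansatz, observable, coupling $J$, sensing time $T$, and field range are all illustrative assumptions.

```python
# Minimal sketch, not the authors' code: two interacting qubits sense a field B,
# then a trainable rotation layer is optimized so the measured expectation value
# is monotone across the target field range. All model choices are assumptions.
import numpy as np
from scipy.linalg import expm

I2 = np.eye(2, dtype=complex)
Y = np.array([[0, -1j], [1j, 0]])
Z = np.diag([1.0, -1.0]).astype(complex)

def kron(*ops):
    out = np.array([[1.0 + 0j]])
    for op in ops:
        out = np.kron(out, op)
    return out

J, T = 1.0, 1.0                      # assumed coupling strength and sensing time
H_INT = J * kron(Z, Z)               # assumed inter-qubit interaction

def sense(B):
    """State after both qubits, prepared in |++>, interact with field B for time T."""
    H = B * (kron(Z, I2) + kron(I2, Z)) + H_INT
    psi0 = np.ones(4, dtype=complex) / 2.0
    return expm(-1j * H * T) @ psi0

def ry(a):
    """Single-qubit rotation exp(-i a Y / 2)."""
    return np.cos(a / 2) * I2 - 1j * np.sin(a / 2) * Y

def response(theta, B):
    """f_theta(B) = <psi| U(theta)† (Z ⊗ I) U(theta) |psi>, toy two-parameter ansatz."""
    psi = kron(ry(theta[0]), ry(theta[1])) @ sense(B)
    return (psi.conj() @ kron(Z, I2) @ psi).real

def loss(theta, Bs):
    """Grid-based monotonicity penalty: zero iff the response is non-decreasing."""
    r = np.array([response(theta, b) for b in Bs])
    return float(np.sum(np.maximum(0.0, -np.diff(r)) ** 2))

Bs = np.linspace(0.0, 2.0, 41)       # assumed target field range
theta = np.array([0.3, -0.2])
for _ in range(200):                 # plain finite-difference gradient descent
    grad = np.zeros_like(theta)
    for k in range(len(theta)):
        step = np.zeros_like(theta); step[k] = 1e-4
        grad[k] = (loss(theta + step, Bs) - loss(theta - step, Bs)) / 2e-4
    theta -= 0.5 * grad
print("trained parameters:", theta, "monotonicity penalty:", loss(theta, Bs))
```

In the paper's setting the trainable circuit would act on many densely packed qubits with far more parameters; the sketch only illustrates the structure of the method: sense, apply $U(\theta)$, measure, and descend on a monotonicity penalty.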