🤖 AI Summary
This work addresses the lack of quantitative error guarantees for noisy quantum neural networks approximating expectation-type objective functions, which arise frequently in finance and related domains. It establishes the first quantitative universal approximation theorem for such models, providing explicit error bounds. The theoretical framework combines function approximation theory with realistic models of noisy quantum computation, tailored to objectives expressed as expectations. Numerical experiments on actual quantum hardware validate the derived bounds, demonstrating their tightness and practical relevance, and confirm that the approach effectively approximates canonical objective functions from quantitative finance. The result is a theoretically rigorous approximation framework for noisy quantum machine learning with formal error guarantees.
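To make the setting concrete, the sketch below shows how a noisy variational circuit can be fitted to an expectation-valued target. This is a minimal illustration under stated assumptions, not the paper's construction: the choice of PennyLane, the one-qubit circuit layout, the depolarizing probability `NOISE_P`, and the toy target function are all ours.

```python
# Illustrative sketch only: a one-qubit noisy variational circuit fitted to a
# smooth payoff-like target. PennyLane, the circuit layout, NOISE_P, and the
# target function are assumptions for illustration, not the paper's setup.
import pennylane as qml
from pennylane import numpy as np

NOISE_P = 0.01  # assumed per-layer depolarizing probability

dev = qml.device("default.mixed", wires=1)  # density-matrix simulator

@qml.qnode(dev)
def noisy_qnn(x, weights):
    # Angle-encode the input, then alternate trainable rotations with a
    # depolarizing channel standing in for hardware noise.
    qml.RY(x, wires=0)
    for w in weights:
        qml.RY(w, wires=0)
        qml.DepolarizingChannel(NOISE_P, wires=0)
    # The model output is itself an expectation value, matching the
    # expectation-type objectives discussed above.
    return qml.expval(qml.PauliZ(0))

def target(x):
    # Toy stand-in for an expectation-type payoff, e.g. a smoothed call profile.
    return np.maximum(np.sin(x), 0.0)

def cost(weights, xs):
    # Mean squared error between the noisy circuit and the target.
    loss = 0.0
    for x in xs:
        loss = loss + (noisy_qnn(x, weights) - target(x)) ** 2
    return loss / len(xs)

xs = np.linspace(0.0, np.pi, 20)
weights = np.array([0.1, 0.2, 0.3], requires_grad=True)
opt = qml.GradientDescentOptimizer(stepsize=0.2)
for _ in range(100):
    weights = opt.step(lambda w: cost(w, xs), weights)

print("final MSE:", cost(weights, xs))  # residual error: expressivity + noise
```

The residual error after training illustrates the two contributions a quantitative universal approximation theorem must bound: the expressivity limit of the circuit family and the floor imposed by the noise channel.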
📝 Abstract
We provide a universal approximation theorem with precise quantitative error bounds for noisy quantum neural networks. We focus on applications to quantitative finance, where target functions are often given as expectations. We complement the theory with a detailed numerical analysis, testing our results on actual noisy quantum hardware.
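As an illustration of the kind of expectation-type target meant here (a standard example from quantitative finance, not one taken from the paper), consider the risk-neutral price of a European call option:

$$
f(s) \;=\; e^{-rT}\,\mathbb{E}\big[\max(S_T - K,\,0)\,\big|\,S_0 = s\big],
$$

where $S_T$ is the terminal asset price, $K$ the strike, $r$ the risk-free rate, and $T$ the maturity. A noisy quantum neural network approximating such an $f$ incurs error both from the expressivity of the circuit and from the hardware noise, which is precisely the decomposition a quantitative universal approximation theorem with explicit bounds must control.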