🤖 AI Summary
Existing Transformer models for LHC jet tagging in high-energy physics suffer from prohibitively high computational complexity, rendering them unsuitable for hardware-trigger systems that require sub-microsecond real-time inference. Method: This work introduces the first integration of multi-head and linear attention mechanisms into the hls4ml toolchain, enabling end-to-end Transformer deployment on a single FPGA. It proposes a synergistic optimization combining fine-grained quantization and distributed arithmetic, co-designed with a custom hardware architecture. Contribution/Results: The implementation achieves an inference latency of ~100 ns, improving on state-of-the-art FPGA-based accelerators by one to two orders of magnitude, while significantly reducing resource utilization. This represents the first practical, hardware-efficient Transformer implementation validated for next-generation, high-luminosity LHC real-time trigger systems.
📝 Abstract
We present the first sub-microsecond transformer implementation on an FPGA, achieving competitive performance on state-of-the-art high-energy physics benchmarks. Transformers have shown exceptional performance on multiple tasks in modern machine learning applications, including jet tagging at the CERN Large Hadron Collider (LHC). However, their computational complexity has, until now, prohibited their use in real-time applications such as the hardware trigger systems of the collider experiments. In this work, we demonstrate the first application of transformers for jet tagging on FPGAs, achieving $\mathcal{O}(100)$ nanosecond latency with superior performance compared to alternative baseline models. We leverage high-granularity quantization and distributed arithmetic optimization to fit the entire transformer model on a single FPGA while achieving the required throughput and latency. Furthermore, we add multi-head attention and linear attention support to hls4ml, making our work accessible to the broader fast machine learning community. This work advances next-generation trigger systems for the High-Luminosity LHC, enabling the use of transformers for real-time applications in high-energy physics and beyond.
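To illustrate why linear attention matters for latency-constrained hardware, the sketch below contrasts standard softmax attention, whose cost grows quadratically with sequence length, against a kernelized linear attention that reassociates the matrix products to run in linear time. This is a minimal NumPy sketch using the common elu(x)+1 feature map from the linear-transformers literature; the paper's exact kernel choice and its fixed-point FPGA implementation may differ.

```python
import numpy as np

def softmax_attention(Q, K, V):
    # Standard attention: forming the n x n score matrix costs O(n^2).
    scores = Q @ K.T / np.sqrt(Q.shape[-1])
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V

def linear_attention(Q, K, V, eps=1e-6):
    # Kernelized attention: with a positive feature map phi,
    # reassociating (phi(Q) phi(K)^T) V as phi(Q) (phi(K)^T V)
    # avoids the n x n matrix, so cost is O(n) in sequence length.
    phi = lambda x: np.where(x > 0, x + 1.0, np.exp(x))  # elu(x) + 1
    Qp, Kp = phi(Q), phi(K)
    KV = Kp.T @ V                   # (d, d_v): independent of n
    Z = Qp @ Kp.sum(axis=0) + eps   # per-query normalizer
    return (Qp @ KV) / Z[:, None]

# Toy jet-constituent-like input: n particles, d features per head.
rng = np.random.default_rng(0)
n, d = 8, 4
Q, K, V = rng.normal(size=(3, n, d))
out = linear_attention(Q, K, V)
print(out.shape)  # (8, 4)
```

The key design point for hardware is that the intermediate `KV` and `Z` terms have sizes set only by the feature dimension, not the number of input particles, which keeps resource usage bounded as event multiplicity grows.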