$XX^{t}$ Can Be Faster

📅 2025-05-14
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
This work addresses the efficient computation of the symmetric matrix product $XX^ op$. We propose the first algorithm discovery framework that integrates machine learning–guided search with combinatorial optimization—bypassing the limitations of conventional algebraic derivation. Our method, RXTX, automatically discovers low-complexity computational graphs while preserving numerical stability, achieving an average 5% reduction in multiply-accumulate (MAC) operations. The improvement is especially pronounced for small-scale dense matrices ($n leq 128$), where RXTX consistently outperforms state-of-the-art approaches—including optimized BLAS implementations, Strassen-based variants, and recent symbolic optimization techniques—in empirical speedup. Crucially, this is the first work to introduce data-driven search into the optimization of dense symmetric matrix multiplication. By unifying learned heuristics with structured combinatorial reasoning, it establishes a novel paradigm for automating the design of fundamental linear algebra primitives.

Technology Category

Application Category

📝 Abstract
We present a new algorithm RXTX that computes product of matrix by its transpose $XX^{t}$. RXTX uses $5%$ less multiplications and additions than State-of-the-Art and achieves accelerations even for small sizes of matrix $X$. The algorithm was discovered by combining Machine Learning-based search methods with Combinatorial Optimization.
Problem

Research questions and friction points this paper is trying to address.

Develops faster algorithm for matrix self-product computation
Reduces multiplications and additions by 5%
Combines ML and optimization for discovery
Innovation

Methods, ideas, or system contributions that make the work stand out.

New algorithm RXTX for matrix product computation
Reduces multiplications and additions by 5%
Combines ML-based search with combinatorial optimization
🔎 Similar Papers
No similar papers found.