AI Summary
Existing state-space models (SSMs) such as HiPPO are constrained by orthogonal, closed-form bases, limiting their capacity to model long-range dependencies effectively. To address this, we propose a novel SSM construction based on Daubechies wavelets: compactly supported, inherently multiscale functions with joint time-frequency localization. This is the first integration of such wavelets into the general SaFARi framework, removing the reliance on orthogonality and enabling flexible, frame-driven parameterization. Our approach substantially enhances memory capacity and inference efficiency, outperforming conventional basis-function SSMs on long-sequence modeling tasks. The work extends the representational boundaries of SSMs and provides a more robust, expressive foundational modeling layer for architectures such as S4 and Mamba.
Abstract
State-Space Models (SSMs) have proven to be powerful tools for modeling long-range dependencies in sequential data. While the recent method known as HiPPO has demonstrated strong performance and formed the basis for the machine learning models S4 and Mamba, it remains limited by its reliance on closed-form solutions for a few specific, well-behaved bases. The SaFARi framework generalized this approach, enabling the construction of SSMs from arbitrary frames, including non-orthogonal and redundant ones, thus allowing an infinite diversity of possible "species" within the SSM family. In this paper, we introduce WaLRUS (Wavelets for Long-range Representation Using SSMs), a new implementation of SaFARi built from Daubechies wavelets.
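For background, the linear recurrence underlying all of these SSM variants can be sketched in a few lines. The snippet below builds the well-known HiPPO-LegS matrices (from the published HiPPO construction, not this paper's wavelet-based one) and runs the discretized state update; WaLRUS would swap this basis for a Daubechies wavelet frame, whose exact matrices are not reproduced here. The function names and the forward-Euler discretization are illustrative choices, not the authors' implementation.

```python
import numpy as np

# Linear SSM recurrence underlying HiPPO-style online memory:
#   x[k+1] = A_bar @ x[k] + B_bar * u[k]
# The (A, B) pair below is the standard HiPPO-LegS construction,
# shown only as background for the basis-dependent part that
# SaFARi/WaLRUS generalizes.

def hippo_legs(N):
    """Continuous-time HiPPO-LegS matrices of state size N."""
    A = np.zeros((N, N))
    B = np.zeros((N, 1))
    for n in range(N):
        B[n, 0] = np.sqrt(2 * n + 1)
        for k in range(N):
            if n > k:
                A[n, k] = -np.sqrt((2 * n + 1) * (2 * k + 1))
            elif n == k:
                A[n, k] = -(n + 1)
    return A, B

def ssm_scan(A, B, u, dt=1e-2):
    """Forward-Euler discretization, then scan over the input sequence u."""
    N = A.shape[0]
    Ad = np.eye(N) + dt * A   # x <- (I + dt*A) x + dt*B u
    Bd = dt * B
    x = np.zeros((N, 1))
    for uk in u:
        x = Ad @ x + Bd * uk
    return x.ravel()          # final state: coefficients of the input history

A, B = hippo_legs(8)
state = ssm_scan(A, B, np.sin(np.linspace(0, 3, 300)))
print(state.shape)  # (8,)
```

The final state vector holds the coefficients of the input history in the chosen basis; changing the basis (Legendre polynomials here, a wavelet frame in WaLRUS) changes only how `A` and `B` are constructed, not the recurrence itself.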