Local Interpolation via Low-Rank Tensor Trains

📅 2026-01-07
🏛️ arXiv.org
📈 Citations: 1
Influential: 0
🤖 AI Summary
High-dimensional grid data represented in the tensor train (TT) format often suffer from rank explosion due to global unfolding, hindering efficient interpolation and compression. This work proposes a low-rank TT local interpolation framework that starts from a coarse-grid TT representation and constructs a fine-grid TT with uniformly bounded tail ranks through multiscale local refinement. The method achieves, for the first time, an ℓ² error bound independent of the total number of cores, exponential compression rates at fixed accuracy, and logarithmic computational complexity with respect to the number of grid points. Its efficacy is demonstrated on 1D/2D/3D tasks—including airfoil mask embedding, image super-resolution, and synthetic turbulent noise—and it enables direct generation of fractal noise fields with logarithmic complexity.
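To fix intuition for the setting, here is a minimal sketch (not the paper's code) of the coarse-grid starting point: a 1D signal on 2^n grid points becomes a quantized tensor train (QTT) with n cores of mode size 2, built by successive truncated SVDs. Smooth signals yield small bond ranks independent of n, which is the low-rank structure the interpolation framework then extends to finer grids. The names tt_svd and tt_to_full and the tolerance eps are illustrative, assuming only numpy.

```python
# Minimal sketch (not the paper's code): compress a grid-sampled 1D function
# into QTT format via successive truncated SVDs (the standard TT-SVD).
import numpy as np

def tt_svd(values, eps=1e-12):
    """Compress a length-2**n array into n TT cores of mode size 2."""
    n = int(np.log2(values.size))
    assert values.size == 2 ** n, "grid size must be a power of two"
    cores, rank = [], 1
    rest = values.reshape(rank * 2, -1)
    for _ in range(n - 1):
        u, s, vt = np.linalg.svd(rest, full_matrices=False)
        keep = max(1, int(np.sum(s > eps * s[0])))  # drop tiny singular values
        cores.append(u[:, :keep].reshape(rank, 2, keep))
        rank = keep
        rest = (s[:keep, None] * vt[:keep]).reshape(rank * 2, -1)
    cores.append(rest.reshape(rank, 2, 1))
    return cores

def tt_to_full(cores):
    """Contract TT cores back into a dense vector (for verification only)."""
    out = cores[0].reshape(2, -1)
    for core in cores[1:]:
        r = core.shape[0]
        out = out.reshape(-1, r) @ core.reshape(r, -1)
    return out.reshape(-1)

x = np.linspace(0.0, 1.0, 2 ** 12, endpoint=False)
f = np.sin(2 * np.pi * x)                     # smooth signal -> tiny TT ranks
cores = tt_svd(f)
print([c.shape for c in cores])               # bond ranks stay at 2 for a sine
print(np.linalg.norm(tt_to_full(cores) - f))  # near machine-precision error
```

The n cores store O(n) numbers in total at fixed rank, versus 2^n grid values for the dense array; that exponential compression is the resource the refinement scheme is designed to preserve.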

📝 Abstract
Tensor Train (TT) decompositions provide a powerful framework for compressing grid-structured data, such as function values sampled on regular Cartesian grids. Such high compression, in turn, enables efficient high-dimensional computations. Exact TT representations are only available for simple analytic functions. Furthermore, global polynomial or Fourier expansions typically yield TT-ranks that grow proportionally with the number of basis terms. State-of-the-art methods are often prohibitively expensive or fail to recover the underlying low-rank structure. We propose a low-rank TT interpolation framework that, given a TT describing a discrete (scalar-, vector-, or tensor-valued) function on a coarse regular grid with $n$ cores, constructs a finer-scale version of the same function represented by a TT with $n+m$ cores, where the last $m$ cores maintain constant rank. Our method guarantees an $\ell^{2}$-norm error bound independent of the total number of cores, achieves exponential compression at fixed accuracy, and admits logarithmic complexity with respect to the number of grid points. We validate its performance through numerical experiments in 1D, 2D, and 3D, including 2D and 3D airfoil mask embeddings, image super-resolution, and synthetic noise fields such as 3D turbulence. In particular, we generate fractal noise fields directly in TT format with logarithmic complexity and memory. This work opens a path to scalable TT-native solvers for complex geometries and to multiscale generative models, with implications ranging from scientific simulation to imaging and real-time graphics.
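As a concrete, hedged illustration of the n → n+m core structure, the sketch below reuses tt_svd and tt_to_full from the sketch above and appends m rank-1 "replication" cores, i.e. piecewise-constant prolongation, the simplest refinement whose tail ranks are uniformly bounded. This is not the paper's algorithm: the paper appends constant-rank *interpolation* cores for higher-order local refinement; refine_piecewise_constant is an assumed name.

```python
import numpy as np

def refine_piecewise_constant(cores, m):
    """Append m rank-1 'replication' cores to a QTT: each coarse value is
    copied onto its 2**m fine subcells. Tail ranks stay at 1 for any m, so
    storage grows linearly in m, i.e. logarithmically in fine-grid points.
    (Illustrative only; the paper appends constant-rank interpolation cores.)"""
    return cores + [np.ones((1, 2, 1)) for _ in range(m)]

# Usage (reusing tt_svd / tt_to_full defined in the earlier sketch):
x = np.linspace(0.0, 1.0, 2 ** 8, endpoint=False)
coarse = tt_svd(np.sin(2 * np.pi * x))       # 8 cores, 2**8 coarse points
fine = refine_piecewise_constant(coarse, 4)  # 12 cores, 2**12 fine points
g = tt_to_full(fine)
assert g.size == 2 ** 12
assert np.allclose(g[::16], tt_to_full(coarse))  # coarse values preserved
```

Swapping the rank-1 replication cores for constant-rank interpolation cores is what upgrades this nearest-neighbor refinement to a scheme with a controlled ℓ² error bound, while keeping the logarithmic cost in the number of fine-grid points.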
Problem

Research questions and friction points this paper is trying to address.

Tensor Train
low-rank interpolation
high-dimensional data
grid refinement
multiscale representation
Innovation

Methods, ideas, or system contributions that make the work stand out.

Tensor Train
low-rank interpolation
multiscale representation
logarithmic complexity
fractal noise generation
Siddhartha E. Guzman
Quantum Research Center, Technology Innovation Institute, Abu Dhabi, UAE
Egor Tiunov
Quantum Research Center, Technology Innovation Institute, Abu Dhabi, UAE
Leandro Aolita
Quantum Research Center, Technology Innovation Institute, Abu Dhabi, UAE