🤖 AI Summary
This work investigates the performance of polarization-adjusted convolutional (PAC) lattices over noisy channels. To address the high construction complexity of conventional lattice designs and the reliance of capacity-approaching proofs on random coding, we propose a structured lattice construction inspired by polar codes: we define polar lattices directly from their generator matrices, rigorously establish their structural equivalence to PAC lattices, and unify the characterization of their structural advantages over additive white Gaussian noise (AWGN) channels. We prove that PAC lattices are AWGN-good: they asymptotically achieve the capacity of lattice coding while admitting encoding and decoding with complexity $O(n \log n)$. This work departs from the traditional random-construction paradigm, providing a framework for high-dimensional lattice coding that simultaneously offers low computational complexity, deterministic construction, and capacity-approaching performance.
📝 Abstract
This paper aims to provide a comprehensive introduction to lattices constructed from polar-like codes and to demonstrate some of their key properties, such as AWGN goodness. We first present polar lattices directly from the perspective of their generator matrix. Next, we discuss their connection with the recently proposed PAC (polarization-adjusted convolutional) lattices and analyze the structural advantages of PAC lattices, through which the AWGN-goodness of PAC lattices can be conveniently demonstrated.
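To make the generator-matrix viewpoint concrete, the following is a minimal sketch (not code from the paper) of the standard polar transform that underlies polar-like code constructions: the generator matrix is the Kronecker power $F^{\otimes m}$ of the kernel $F = \begin{bmatrix} 1 & 0 \\ 1 & 1 \end{bmatrix}$, and multiplying by it over GF(2) can be done with a butterfly recursion in $O(n \log n)$ rather than an $O(n^2)$ matrix product. The function names and the omission of bit-reversal and frozen-bit handling are illustrative assumptions, not the paper's construction.

```python
import numpy as np

def polar_generator(m):
    """Return the n x n polar transform matrix F^{(x)m} over GF(2), n = 2^m."""
    F = np.array([[1, 0], [1, 1]], dtype=np.uint8)  # polar kernel
    G = np.array([[1]], dtype=np.uint8)
    for _ in range(m):
        G = np.kron(G, F) % 2  # Kronecker power, reduced mod 2
    return G

def polar_encode(u):
    """Compute u @ F^{(x)m} mod 2 with an O(n log n) butterfly recursion."""
    x = np.array(u, dtype=np.uint8).copy()
    n = len(x)
    stride = 1
    while stride < n:
        for i in range(0, n, 2 * stride):
            # Each pair (a, b) maps to (a XOR b, b), matching the kernel F.
            x[i:i + stride] ^= x[i + stride:i + 2 * stride]
        stride *= 2
    return x
```

For $n = 8$, `polar_encode(u)` agrees with the direct product `(u @ polar_generator(3)) % 2` while touching each symbol only $\log_2 n$ times, which is the source of the $O(n \log n)$ encoding complexity cited above.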