🤖 AI Summary
This work addresses the joint problem of intrinsic dimension estimation and geometry-invariant embedding learning for nonlinear, manifold-structured data. We propose an autoencoder framework that imposes orthogonality constraints on the gradients of the latent coordinates. Methodologically, we establish a theoretical connection between gradient orthogonality in the network's latent space and the local tangent-space dimension of the underlying manifold; this enables simultaneous intrinsic dimension estimation, learning of invertible embedding maps, and construction of coordinate-invariant representations under local Lie group actions on low-dimensional submanifolds. The key contribution lies in unifying gradient orthogonality with differential-geometric structure, thereby extending invariant representation learning to continuous group actions. Experiments on standard benchmarks demonstrate accurate intrinsic dimension estimation, disentangled representations, and robust group-invariant embeddings, supporting both the theoretical analysis and the robustness of the algorithm.
📝 Abstract
The Conformal Autoencoder is a neural network architecture that imposes orthogonality conditions between the gradients of the latent variables in order to achieve disentangled representations of data. In this letter we show that orthogonality relations within the latent layer of the network can be leveraged to infer the intrinsic dimensionality of nonlinear manifold data sets (locally characterized by the dimension of their tangent space), while simultaneously computing the encoding and decoding (embedding) maps. We outline the relevant differential-geometric theory and describe the corresponding gradient-descent optimization algorithm. The method is applied to standard data sets, and we highlight its applicability, advantages, and shortcomings. In addition, we demonstrate that the same computational technology can be used to build coordinate invariance to local group actions defined only on a (reduced) submanifold of the embedding space.
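To make the orthogonality condition concrete, the sketch below is a minimal NumPy illustration (not the paper's implementation) of the quantity such a network penalizes: the pairwise inner products between the gradients of distinct latent coordinates of an encoder. The helper names (`encoder_jacobian`, `orthogonality_penalty`) and the toy encoders are illustrative assumptions; in practice the penalty would be added to an autoencoder's reconstruction loss and the Jacobian computed by automatic differentiation rather than finite differences.

```python
import numpy as np

def encoder_jacobian(f, x, eps=1e-6):
    """Finite-difference Jacobian of an encoder f at point x.

    Row i holds the gradient of the i-th latent coordinate with
    respect to the input, so the result has shape (latent_dim, input_dim).
    """
    fx = np.asarray(f(x), dtype=float)
    J = np.zeros((fx.size, x.size))
    for j in range(x.size):
        xp = x.copy()
        xp[j] += eps
        J[:, j] = (np.asarray(f(xp), dtype=float) - fx) / eps
    return J

def orthogonality_penalty(J):
    """Sum of squared inner products between gradients of distinct
    latent coordinates; zero iff the gradients are mutually orthogonal."""
    G = J @ J.T                       # Gram matrix of latent-coordinate gradients
    off = G - np.diag(np.diag(G))     # keep only the cross terms, i != j
    return float(np.sum(off ** 2))

# Toy encoders from R^3 to R^2 (illustrative, not from the paper):
ortho = lambda x: np.array([x[0], x[1]])          # gradients e1, e2: orthogonal
skew = lambda x: np.array([x[0] + x[1], x[1]])    # gradients e1+e2, e2: overlapping

x0 = np.array([0.3, -0.5, 0.8])
print(orthogonality_penalty(encoder_jacobian(ortho, x0)))  # ~0
print(orthogonality_penalty(encoder_jacobian(skew, x0)))   # ~2.0
```

Driving this penalty to zero at every data point forces the latent coordinates to vary along mutually orthogonal directions; latent coordinates whose gradients collapse to (near) zero magnitude are then superfluous, which is what allows the intrinsic (tangent-space) dimension to be read off from the trained network.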