🤖 AI Summary
Conventional linear metamaterials face intrinsic limits as physical computers: nonlinear computational paradigms demand a precise mapping between discrete lattice degrees of freedom and metamaterial excitations, which linear models cannot supply. Method: We propose a geometric-mapping paradigm, a tight-binding model grounded in a nonlinear coordinate transformation, that for the first time establishes an exact correspondence between lattice sites and nonlinear excitations. The framework unifies three distinct computing paradigms: combinatorial optimization (e.g., Ising solving), in-memory computing (a mechanical racetrack memory), and neuromorphic computing (speech classification). Contribution/Results: Using microstructure inverse design, we fabricate and experimentally validate three functional prototypes, demonstrating programmable physical computing with nonlinear metamaterials. Crucially, we establish the first rigorous, cross-scale, cross-domain theoretical mapping valid under nonlinear conditions, overcoming the fundamental constraints of linear metamaterial-based computation.
📝 Abstract
Designing metamaterials that carry out advanced computations poses a significant challenge. A powerful design strategy splits the problem into two steps: first, encoding the desired functionality in a discrete or tight-binding model, and second, identifying a metamaterial geometry that conforms to the model. Applying this approach to information-processing tasks requires accurately mapping nonlinearity -- an essential element for computation -- from discrete models to geometries. Here we formulate this mapping through a nonlinear coordinate transformation that accurately connects tight-binding degrees of freedom to metamaterial excitations in the nonlinear regime. This transformation allows us to design information-processing metamaterials across the broad range of computations that can be expressed as tight-binding models, a capability we showcase with three examples based on three different computing paradigms: a coherent Ising machine that approximates combinatorial optimization problems through energy minimization, a mechanical racetrack memory exemplifying in-memory computing, and a speech classification metamaterial based on analog neuromorphic computing.
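The first showcased example, a coherent Ising machine, approximates combinatorial optimization by relaxing a network of coupled bistable elements toward a minimum of an Ising energy. As a toy illustration of that principle (not the paper's metamaterial implementation), the following sketch minimizes the energy of a small random Ising instance by greedy single-spin flips; the names `ising_energy`, `greedy_minimize`, and the coupling matrix `J` are illustrative assumptions, not notation from the paper.

```python
import random

def ising_energy(spins, J):
    """Ising energy E = -sum_{i<j} J[i][j] * s_i * s_j (no external field)."""
    n = len(spins)
    return -sum(J[i][j] * spins[i] * spins[j]
                for i in range(n) for j in range(i + 1, n))

def greedy_minimize(spins, J, sweeps=100):
    """Flip any spin whose flip lowers the energy; stop at a local minimum."""
    n = len(spins)
    for _ in range(sweeps):
        improved = False
        for i in range(n):
            e_old = ising_energy(spins, J)
            spins[i] *= -1                      # trial flip of spin i
            if ising_energy(spins, J) < e_old:
                improved = True                 # keep the flip
            else:
                spins[i] *= -1                  # revert the flip
        if not improved:
            break                               # local minimum reached
    return spins

random.seed(0)
n = 8
# Random +/-1 couplings on the upper triangle of J.
J = [[0.0] * n for _ in range(n)]
for i in range(n):
    for j in range(i + 1, n):
        J[i][j] = random.choice([-1.0, 1.0])

spins = [random.choice([-1, 1]) for _ in range(n)]
e0 = ising_energy(spins, J)
spins = greedy_minimize(spins, J)
e1 = ising_energy(spins, J)
print(e0, e1)  # local search never increases the energy
```

A physical Ising machine performs the analogous relaxation in hardware: the lattice's elastic energy plays the role of `ising_energy`, and the dynamics settle into a low-energy spin configuration without an explicit flip loop.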