🤖 AI Summary
This work proposes a new computational problem and approach: integrating reinforcement learning-driven musculoskeletal forward simulation with touch interaction logs to synthesize biomechanically grounded human motion sequences. Addressing the limitation that existing touch logs poorly capture users' underlying biomechanical actions, the approach couples a software emulator with a physics simulator so that biomechanical models can manipulate real applications, reconstructing rich kinematic data in real time, including movement trajectories, speed, accuracy, and effort. Comparison against motion-capture data and a demonstration on a large-scale dataset indicate that the generated motions are biomechanically plausible, offering an effective way to infer detailed human motor behavior from sparse interaction logs.
📝 Abstract
Touch data from mobile devices are collected at scale but reveal little about the interactions that produce them. While biomechanical simulations can illuminate motor control processes, they have not yet been developed for touch interactions. To close this gap, we propose a novel computational problem: synthesizing plausible motion directly from logs. Our key insight is a reinforcement learning-driven musculoskeletal forward simulation that generates biomechanically plausible motion sequences consistent with events recorded in touch logs. We achieve this by integrating a software emulator into a physics simulator, allowing biomechanical models to manipulate real applications in real time. Log2Motion produces rich syntheses of user movements from touch logs, including estimates of motion, speed, accuracy, and effort. We assess the plausibility of generated movements by comparing them against human data from a motion capture study and prior findings, and demonstrate Log2Motion on a large-scale dataset. Biomechanical motion synthesis provides a new way to understand log data, illuminating the ergonomics and motor control underlying touch interactions.