Research Projects
- XARP: A Human-First and AI Agent-Ready Extended Reality Toolkit (2025)
- GraspR: Model of Spatial User Preferences for Adaptive Grasp Interfaces (2025)
- MixITS: Design Toolkit for Task Support in Mixed Reality (2025)
- Conversational AI Probe: Agentic Design Workflows (2025)
- Grasp Pressure: Visualization for Task Assistance (2024)
- GraV: Kinematic Simulation for Grasp-Based UI Design (2024)
- Teaching AI: Mentoring High School Students (2024)
- Social Agents: Hobbes meets LLMs (2024)
- Virtual Steps: VR Walking for a Lifelong Wheelchair User (2024)
- ModBand: Egocentric Multimodal Sensing (2023)
- V-Buddy: Conversational Interface for Users with Hand Motor Disability (2023)
- ARLang: Language Learning with Outdoor AR (2023)
- XR Task Guidance: Prototypes of XR UIs for Physical Task Guidance (2023)
- ARfy: Geometric Alignment of 3D Scenes to Augmented Reality (2022)
Research Experience
- Research Projects: XARP, GraspR, MixITS, Conversational AI Probe, Grasp Pressure, GraV, Teaching AI, Social Agents, Virtual Steps, ModBand, V-Buddy, ARLang, XR Task Guidance, and ARfy (see the project list above)
- Position: PhD Candidate
Education
PhD Candidate, HAX Lab, University of California, Santa Barbara (UCSB)
Background
Research Interests: Building AI and XR toolkits that enable personalized solutions for users' bodies, minds, and environments. Key research questions include how to balance scale with personalization, how to identify and model the contextual and user factors critical to design, and how to preserve human agency throughout this process.
Miscellany
Personal Interests: Accessibility design, co-design methods, conversational AI, and task support, among other topics.