Published several papers, including: 'Skyfall-GS: Synthesizing Immersive 3D Urban Scenes from Satellite Imagery' (arXiv, 2025), 'See, Point, Fly: A Learning-Free VLM Framework for Universal Unmanned Aerial Navigation' (CoRL, 2025), and 'AuraFusion360: Augmented Unseen Region Alignment for Reference-based 360° Unbounded Scene Inpainting' (CVPR, 2025). Won third place in the NYCU CS Undergraduate Research Competition and received the NSTC Research Grant for University Students.
Research Experience
Software Engineer on Google’s Pixel Camera Team, developing on-device camera algorithms. Previously interned with the same team in Summer 2024, integrating the Segment Anything Model for mobile devices, hosted by Yu-Lin Chang and Chung-Kai Hsieh. Earlier, an R&D Intern at Microsoft and a Backend Engineer Intern at Appier.
Education
Ph.D. student in Computer Science at National Yang Ming Chiao Tung University, advised by Prof. Yu-Lun Liu. B.S. in Computer Science from National Yang Ming Chiao Tung University, including an exchange program at ETH Zurich.
Background
Research interests include 3D Reconstruction, Neural Radiance Fields (NeRFs), 3D Gaussian Splatting, Large-scale Scene Reconstruction, Satellite Imagery, 3D Generation, Urban Scene Generation, Object Generation and Manipulation, Image Segmentation, and the Segment Anything Model.