Chenming Wu

Google Scholar ID: eOkkQWUAAAAJ
Researcher, Baidu Inc.
Robotics · Graphics · 3D Vision · Computational Design
Citations & Impact (All-time)
  • Citations: 1,510
  • h-index: 20
  • i10-index: 29
  • Publications: 20
  • Co-authors: 0
Resume (English only)
Academic Achievements
  • Published multiple papers, including:
    - NeuS-PIR: Learning Relightable Neural Surface using Pre-Integrated Rendering
    - GS-RoadPatching: Inpainting Gaussians via 3D Searching and Placing for Driving Scenes
    - Surfel-based Gaussian Inverse Rendering for Fast and Relightable Dynamic Human Reconstruction from Monocular Videos
    - GIR: 3D Gaussian Inverse Rendering for Relightable Scene Factorization
    - U-ViLAR: Uncertainty-Aware Visual Localization for Autonomous Driving via Differentiable Association and Registration
    - DriVerse: Navigation World Model for Driving Simulation via Multimodal Trajectory Prompting and Motion Alignment
    - TexGaussian: Generating High-quality PBR Material via Octree-based 3D Gaussian Splatting
    - Splatter-360: Generalizable 360 Gaussian Splatting for Wide-baseline Panoramic Images
    - VDG: Vision-Only Dynamic Gaussian for Driving Simulation
    - Gaussian-LIC: Real-Time Photo-Realistic SLAM with Gaussian Splatting and LiDAR-Inertial-Camera Fusion
    - XLD: A Cross-Lane Dataset for Benchmarking Novel Driving View Synthesis
Research Experience
  • Researcher at Baidu Inc.; previously worked at Tencent; interned at AiFi Inc.
Education
  • Received a Ph.D. from the Graphics and Geometric Computing Group, Department of Computer Science and Technology, Tsinghua University, in 2020. Visited the Graphics and Imaging Lab, Paul G. Allen School of CSE, University of Washington (Seattle, WA) in 2019, and the Department of Design Engineering, Delft University of Technology (Netherlands), in 2017.
Background
  • Research interests include computational design, video/3D generation, and robotics.
Miscellany
  • Actively recruiting full-time employees and interns to work on reconstruction and generation.