🤖 AI Summary
This work addresses the dual challenges of efficiency and visual fidelity in interactive medical volume visualization with arbitrary-angle slicing for mobile virtual reality. We propose an optimized framework based on 3D Gaussian Splatting that restructures the neural inference pipeline of ClipGS, integrates precomputed multi-view slice states into a unified rendering structure, and introduces a gradient-guided opacity modulation mechanism. Our approach achieves, for the first time, real-time, high-fidelity, arbitrary-angle slice rendering of medical volumes on consumer-grade mobile VR devices, delivering near-offline rendering quality while significantly improving interactivity and system usability.
📝 Abstract
High-fidelity cinematic medical visualization on mobile virtual reality (VR) remains challenging. Although ClipGS enables cross-sectional exploration via 3D Gaussian Splatting, it does not support arbitrary-angle slicing on consumer-grade VR headsets. To achieve real-time interactive performance, we introduce ClipGS-VR, which restructures ClipGS's neural inference into a consolidated dataset, integrating high-fidelity layers from multiple precomputed slicing states into a unified rendering structure. The framework further supports arbitrary-angle slicing via gradient-based opacity modulation for smooth, visually coherent rendering. Evaluations confirm that our approach maintains visual fidelity comparable to offline results while offering superior usability and interaction efficiency.
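The abstract does not spell out how the opacity modulation works, but the general idea of smoothly attenuating Gaussian opacity near an arbitrary clipping plane can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function names, the smoothstep falloff, and the `band` width parameter are all assumptions introduced here for clarity.

```python
import numpy as np

def plane_signed_distance(points, plane_point, plane_normal):
    """Signed distance of each 3D point to the slicing plane."""
    n = plane_normal / np.linalg.norm(plane_normal)
    return (points - plane_point) @ n

def modulate_opacity(opacities, points, plane_point, plane_normal, band=0.05):
    """Smoothly fade Gaussian opacities across an arbitrary clipping plane.

    Points behind the plane (negative signed distance) are fully clipped;
    points within `band` of the plane fall off with a smoothstep curve,
    avoiding the hard popping artifacts of a binary clip. (Illustrative
    sketch only; the actual gradient-guided mechanism may differ.)
    """
    d = plane_signed_distance(points, plane_point, plane_normal)
    t = np.clip(d / band, 0.0, 1.0)
    weight = t * t * (3.0 - 2.0 * t)  # smoothstep interpolation
    return opacities * weight

# Example: three Gaussian centers at varying depth relative to a z = 0 plane
pts = np.array([[0.0, 0.0, -0.1], [0.0, 0.0, 0.025], [0.0, 0.0, 0.2]])
alpha = np.ones(3)
out = modulate_opacity(alpha, pts, np.array([0.0, 0.0, 0.0]), np.array([0.0, 0.0, 1.0]))
# out → [0.0, 0.5, 1.0]: clipped, mid-falloff, and fully visible
```

A falloff of this kind keeps the rendered cross-section visually continuous as the plane is rotated to arbitrary angles, since no splat switches visibility in a single frame.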