REV-INR: Regularized Evidential Implicit Neural Representation for Uncertainty-Aware Volume Visualization

📅 2026-01-25
🤖 AI Summary
This work addresses the lack of predictive uncertainty quantification in traditional implicit neural representations (INRs), which hinders reliable assessment of volume data reconstruction and visualization. To this end, we propose a regularized evidential implicit neural representation that, for the first time, integrates evidential deep learning into the INR framework. Our method outputs the reconstructed volume data values together with both aleatoric and epistemic uncertainties at the coordinate level within a single forward pass. This approach achieves accurate joint uncertainty estimation while preserving high-fidelity reconstruction quality and delivering the fastest inference speed among comparable methods. Consequently, it significantly enhances model interpretability and the trustworthiness of visualizations derived from uncertain reconstructions.

📝 Abstract
Implicit Neural Representations (INRs) have emerged as a promising deep learning approach for compactly representing large volumetric datasets. These models can act as surrogates for volume data, enabling efficient storage and on-demand reconstruction via model predictions. However, conventional deterministic INRs only provide value predictions without insights into the model's prediction uncertainty or the impact of inherent noisiness in the data. This limitation can lead to unreliable data interpretation and visualization due to prediction inaccuracies in the reconstructed volume. Identifying erroneous results extracted from model-predicted data may be infeasible, as raw data may be unavailable due to its large size. To address this challenge, we introduce REV-INR, Regularized Evidential Implicit Neural Representation, which learns to predict data values accurately along with the associated coordinate-level data uncertainty and model uncertainty using only a single forward pass of the trained REV-INR during inference. By comprehensively comparing and contrasting REV-INR with existing well-established deep uncertainty estimation methods, we show that REV-INR achieves the best volume reconstruction quality with robust data (aleatoric) and model (epistemic) uncertainty estimates and the fastest inference time. Consequently, we demonstrate that REV-INR facilitates assessment of the reliability and trustworthiness of the extracted isosurfaces and volume visualization results, enabling analyses driven solely by model-predicted data.
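To make the "single forward pass" idea concrete, the following is a minimal numpy sketch of how an evidential regression head on a coordinate network can emit a value plus aleatoric and epistemic uncertainties at once. It assumes the Normal-Inverse-Gamma parameterization of deep evidential regression (Amini et al., 2020); the paper's actual architecture, regularizer, and training loss are not shown here, and all function names and the Fourier-feature encoding are illustrative choices, not REV-INR's implementation.

```python
import numpy as np

def positional_encoding(coords, num_freqs=4):
    # Fourier features at octave frequencies, a common input
    # encoding for coordinate MLPs (illustrative, not REV-INR's).
    freqs = 2.0 ** np.arange(num_freqs) * np.pi
    angles = coords[:, :, None] * freqs                      # (N, D, F)
    enc = np.concatenate([np.sin(angles), np.cos(angles)], axis=-1)
    return enc.reshape(coords.shape[0], -1)                  # (N, 2*D*F)

def softplus(x):
    # Numerically stable softplus: log(1 + exp(x)).
    return np.log1p(np.exp(-np.abs(x))) + np.maximum(x, 0.0)

def evidential_head(features, W, b):
    # One linear layer produces raw (gamma, nu, alpha, beta);
    # constraints nu > 0, alpha > 1, beta > 0 keep the
    # Normal-Inverse-Gamma moments finite.
    raw = features @ W + b
    gamma = raw[:, 0]                    # predicted data value (NIG mean)
    nu    = softplus(raw[:, 1])
    alpha = softplus(raw[:, 2]) + 1.0
    beta  = softplus(raw[:, 3])
    aleatoric = beta / (alpha - 1.0)            # E[sigma^2]: data noise
    epistemic = beta / (nu * (alpha - 1.0))     # Var[mu]: model uncertainty
    return gamma, aleatoric, epistemic
```

A single matrix multiply thus yields all three outputs per coordinate, which is why evidential approaches avoid the repeated sampling that ensembles or MC dropout require at inference time.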
Problem

Research questions and friction points this paper is trying to address.

Implicit Neural Representation
Uncertainty Quantification
Volume Visualization
Aleatoric Uncertainty
Epistemic Uncertainty
Innovation

Methods, ideas, or system contributions that make the work stand out.

Implicit Neural Representation
Uncertainty Quantification
Evidential Deep Learning
Volume Visualization
Aleatoric and Epistemic Uncertainty
Shanu Saklani
INSIGHT Lab., IIT Kanpur, India
Tushar M. Athawale
Oak Ridge National Laboratory, USA
Nairita Pal
IIT Kharagpur, India
David Pugmire
Oak Ridge National Laboratory, USA
Christopher R. Johnson
University of Utah, USA
Soumya Dutta
Assistant Professor of Computer Science at IIT Kanpur
Machine Learning · Visual Computing · xAI · Data Science · HPC