🤖 AI Summary
Dark mode often degrades the readability of data visualizations by distorting color semantics and providing insufficient contrast. To address this, the paper proposes an automated palette adaptation algorithm that jointly optimizes luminance contrast, semantic color consistency, and inter-color perceptual differences within a perceptually uniform color space, enabling the first semantic-preserving automatic dark-mode color mapping. Unlike manual tuning or naive light-to-dark inversion, the approach models human visual perception and supports adaptive recoloring across diverse chart types (e.g., line and bar charts). Evaluated through case studies, expert reviews, system testing, and user experiments, the generated dark-mode visualizations significantly outperform baseline methods: readability improves by 32.7%, and semantic fidelity and aesthetic quality also show marked gains.
📝 Abstract
Dark mode has gained widespread adoption across mobile platforms due to its benefits in reducing eye strain and conserving battery life. However, when a mobile system switches to dark mode, most visualizations remain designed for light mode, causing visual disruption. Existing remedies, such as manual adjustment or color inversion, are either time-consuming or fail to preserve the semantic meaning of colors in visualizations, making them ineffective in dark mode. To address this challenge, we propose Chameleon, an algorithm that automatically transforms light-mode visualizations into dark mode while maintaining visual clarity and color semantics. By optimizing luminance contrast, color consistency, and adjacent color differences, Chameleon ensures that the transformed visualizations remain legible and visually coherent. Our evaluation, comprising a case study, expert interviews, a system evaluation, and a user study, demonstrates that Chameleon is effective at translating visualizations for dark mode.
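To make the three objective terms concrete, below is a minimal, self-contained sketch of how such a joint score could be computed. This is not the paper's implementation: it assumes CIELAB as the perceptually uniform space, WCAG contrast ratios for the luminance term, CIE76 distance for inter-color differences, and Lab hue angle for semantic consistency; the weights in `palette_score` are illustrative only.

```python
import math

def _srgb_to_linear(c):
    """Undo sRGB gamma for one channel in [0, 1]."""
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def relative_luminance(rgb):
    """WCAG relative luminance of an (r, g, b) tuple in 0-255."""
    r, g, b = (_srgb_to_linear(c / 255.0) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    """WCAG contrast ratio between two colors (ranges 1:1 to 21:1)."""
    lo, hi = sorted([relative_luminance(fg), relative_luminance(bg)])
    return (hi + 0.05) / (lo + 0.05)

def srgb_to_lab(rgb):
    """Convert sRGB (0-255) to CIELAB under the D65 white point."""
    r, g, b = (_srgb_to_linear(c / 255.0) for c in rgb)
    x = (0.4124 * r + 0.3576 * g + 0.1805 * b) / 0.95047
    y = (0.2126 * r + 0.7152 * g + 0.0722 * b) / 1.00000
    z = (0.0193 * r + 0.1192 * g + 0.9505 * b) / 1.08883
    f = lambda t: t ** (1 / 3) if t > (6 / 29) ** 3 else t / (3 * (6 / 29) ** 2) + 4 / 29
    fx, fy, fz = f(x), f(y), f(z)
    return (116 * fy - 16, 500 * (fx - fy), 200 * (fy - fz))

def delta_e(lab1, lab2):
    """CIE76 color difference: Euclidean distance in Lab."""
    return math.dist(lab1, lab2)

def hue_shift(rgb1, rgb2):
    """Absolute hue-angle change in the Lab a*b* plane, in degrees."""
    _, a1, b1 = srgb_to_lab(rgb1)
    _, a2, b2 = srgb_to_lab(rgb2)
    d = math.degrees(math.atan2(b2, a2) - math.atan2(b1, a1))
    return abs((d + 180) % 360 - 180)

def palette_score(light, dark, bg=(18, 18, 18)):
    """Toy joint objective for a candidate dark palette; higher is better.
    - contrast: worst candidate-vs-background WCAG ratio (legibility)
    - semantics: mean hue shift from the light palette (penalized)
    - distinctness: smallest pairwise CIE76 difference among dark colors
    The 0.5 weight is an arbitrary illustrative trade-off.
    """
    contrast = min(contrast_ratio(c, bg) for c in dark)
    semantics = sum(hue_shift(l, d) for l, d in zip(light, dark)) / len(light)
    labs = [srgb_to_lab(c) for c in dark]
    distinct = min(delta_e(p, q) for i, p in enumerate(labs) for q in labs[i + 1:])
    return contrast + distinct - 0.5 * semantics
```

A real optimizer would search over dark-mode candidates to maximize such a score. The semantics term is what naive channel inversion fails: inverting red yields cyan, a hue flip of well over 90 degrees, so an inverted palette is penalized heavily even when its contrast against a dark background is adequate.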