Summary
We developed and validated PREVis, a reliable questionnaire to measure how readable people find data visualizations.
This work received a Best Paper Honorable Mention Award at IEEE Visualization 2024.
Although readability is widely recognized as an essential quality of data visualizations, there has so far been no unified definition of the construct in the context of visual representations. As a result, researchers often lack guidance on how to ask people to rate the perceived readability of a visualization. To address this issue, we engaged in a rigorous process to develop the first validated instrument targeted at the subjective readability of visual data representations. Our final instrument consists of 11 items across 4 dimensions: understandability, layout clarity, readability of data values, and readability of data patterns.
Researchers and practitioners can easily use this instrument as part of their evaluations to compare the perceived readability of different visual data representations. PREVis can complement results from controlled experiments on user task performance or provide additional data during in-depth qualitative work such as design iterations when developing a new technique. We provide the questionnaire as a PDF with implementation guidelines on our OSF repository.
Beyond this instrument, we contribute a discussion of how researchers have previously assessed visualization readability, and an analysis of the factors underlying perceived readability in visual data representations.
I presented this paper at IEEE VIS 2024. Watch the 9-minute presentation below!
Anne-Flore Cabouat, Tingying He, Petra Isenberg, and Tobias Isenberg. PREVis: Perceived Readability Evaluation for Visualizations. IEEE Transactions on Visualization and Computer Graphics, vol. 31, 2025. To appear.