Information Visualization and Proxemics: Design Opportunities and Empirical Findings

Abstract

People typically interact with information visualizations using a mouse. Their physical movement, orientation, and distance to visualizations are rarely used as input. We explore how to use such spatial relations among people and visualizations (i.e., proxemics) to drive interaction with visualizations, focusing here on the spatial relations between a single user and visualizations on a large display. We implement interaction techniques that zoom and pan, query and relate, and adapt visualizations based on tracking of users' position in relation to a large high-resolution display. Alternative prototypes are tested in three user studies and compared with baseline conditions that use a mouse. Our aim is to gain empirical data on the usefulness of a range of design possibilities and to generate more ideas. Among other things, the results show promise for changing zoom level or visual representation with the user's physical distance to a large display. We discuss possible benefits and potential issues to avoid when designing information visualizations that use proxemics.
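One technique mentioned above maps the user's physical distance from the large display to the visualization's zoom level. The paper does not give an implementation; below is a minimal sketch of one plausible mapping, assuming a linear interpolation between a hypothetical near/far tracking range. All parameter names and values (`near`, `far`, `min_zoom`, `max_zoom`) are illustrative assumptions, not values from the paper.

```python
def zoom_for_distance(distance_m, near=0.5, far=3.0,
                      min_zoom=1.0, max_zoom=8.0):
    """Map a tracked user-to-display distance (meters) to a zoom factor.

    Closer to the display -> higher zoom (more detail);
    farther away -> lower zoom (overview).
    The range endpoints and zoom limits are hypothetical.
    """
    # Clamp the tracked distance into the assumed tracking range.
    d = max(near, min(far, distance_m))
    # Normalize to [0, 1]: 0 at the near bound, 1 at the far bound.
    t = (d - near) / (far - near)
    # Linearly interpolate: max_zoom up close, min_zoom far away.
    return max_zoom + t * (min_zoom - max_zoom)
```

In a real system this function would be driven continuously by a position tracker, and the raw distance would likely be smoothed (e.g., with a low-pass filter) to avoid jittery zooming.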

Citation

Mikkel Rønne Jakobsen, Yonas Sahlemariam Haile, Søren Knudsen, and Kasper Hornbæk. Information Visualization and Proxemics: Design Opportunities and Empirical Findings. IEEE Transactions on Visualization and Computer Graphics (IEEE TVCG), vol. 19, no. 12, Dec. 2013. http://dx.doi.org/10.1109/TVCG.2013.166

Paper

Supplemental material
