Seven guiding scenarios for information visualization evaluation

Title: Seven guiding scenarios for information visualization evaluation
Publication Type: Report
Year of Publication: 2011
Authors: Lam H, Bertini E, Isenberg P, Plaisant C, Carpendale S
Institution: Department of Computer Science, University of Calgary

We take a new, scenario-based look at evaluation in information visualization. Our seven scenarios (evaluating visual data analysis and reasoning; evaluating user performance; evaluating user experience; evaluating environments and work practices; evaluating communication through visualization; automated evaluation of visualizations; and evaluating collaborative data analysis) were derived through an extensive literature review of over 800 visualization publications. Each scenario is described through its goals and the types of questions it embodies, and is illustrated with example studies. Through this broad survey and the distillation of these scenarios we make two contributions. First, we encapsulate current practices in the information visualization research community; second, we provide a different approach to deciding what might be the most effective evaluation of a given information visualization. For example, if the research goals or evaluative questions are known, they can be mapped to specific scenarios, where practical existing examples can be consulted for effective evaluation approaches.