Slide 1

Information Visualisation: Analysis and Validation
Prof. Beat Signer
Department of Computer Science, Vrije Universiteit Brussel
beatsigner.com

Slide 2

Four Nested Levels of Vis Design

Slide 3

Validation
▪ Huge vis design space and most designs are ineffective
▪ Validate choices right from the beginning of the design process
  ▪ top-down design (problem-driven work)
    - start at the top domain situation level
    - major challenge at the data/task abstraction level; mainly use existing idioms
  ▪ bottom-up design (technique-driven work)
    - invention of new idioms or algorithms
▪ Independently validate all four levels of the design
  ▪ domain validation
  ▪ abstraction validation (what and why)
  ▪ idiom validation (how)
  ▪ algorithm validation

Slide 4

Four Nested Levels of Vis Design
▪ Output from an upstream level is input to the downstream level
▪ errors at upstream levels propagate to downstream levels
▪ highly iterative design process

Slide 5

Domain Situation
▪ A domain situation is defined by
  ▪ target users
  ▪ domain of interest of the target users
    - each domain might have its own vocabulary
  ▪ data of the target users
  ▪ questions (tasks) of the target users
▪ Outcome of the design process
  ▪ understanding of user needs (user-centred design)
    - e.g. via observations or interviews
▪ Challenges and risks
  ▪ users often cannot clearly specify their analysis needs
  ▪ designers make assumptions (rather than engaging with users)

Slide 6

Data & Task Abstraction
▪ Abstract from the domain-specific questions of the upstream level to a generic representation
  ▪ questions from different domain situations can map to the same abstract vis tasks
    - e.g. browsing, comparing or summarising
▪ Design abstract data
  ▪ data from the upstream level is often transformed into something different
  ▪ determine which data type supports a visual representation that addresses the user's problem
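As an illustration (not part of the slides), a minimal sketch of what this abstraction step can look like in code. The flight-delay domain, its records and its field names are invented for the example, but the mapping from domain vocabulary to abstract attribute types (categorical, ordered, quantitative) follows the idea above.

```python
from dataclasses import dataclass
from enum import Enum

# Abstract attribute types, independent of any particular domain
class AttrType(Enum):
    CATEGORICAL = "categorical"
    ORDERED = "ordered"
    QUANTITATIVE = "quantitative"

@dataclass
class Attribute:
    name: str
    attr_type: AttrType

# Hypothetical domain data: flight records with domain-specific vocabulary
flights = [
    {"carrier": "BA", "day": "Mon", "delay_minutes": 12.5},
    {"carrier": "LH", "day": "Tue", "delay_minutes": 3.0},
]

# Data abstraction: a flat table whose columns are typed attributes.
# The domain question "which airline is usually late?" maps to the abstract
# task "compare the distribution of a quantitative attribute across the
# levels of a categorical attribute".
schema = [
    Attribute("carrier", AttrType.CATEGORICAL),
    Attribute("day", AttrType.ORDERED),
    Attribute("delay_minutes", AttrType.QUANTITATIVE),
]

for attr in schema:
    values = [row[attr.name] for row in flights]
    print(attr.name, attr.attr_type.value, values)
```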

Slide 7

Visual Encoding & Interaction Idiom
▪ Specific way (idiom) to create and manipulate the visual representation of the abstract data
  ▪ visual encoding idiom
    - create a "picture" out of the data (what do users see?)
  ▪ interaction idiom
    - how do users change what they see?
▪ Design space of the combination of visual encoding and interaction idioms is very large
  ▪ data and task abstractions help to reduce the number of potential visual encoding and interaction idioms
  ▪ decisions about good or bad matches are based on human abilities (visual perception and memory)
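As a small illustrative sketch (not from the slides), one concrete point in this design space: the scatterplot encoding idiom, where position encodes two quantitative attributes and colour encodes a categorical one, combined with pan/zoom as the interaction idiom supplied by matplotlib's interactive window. The data values are made up.

```python
import matplotlib.pyplot as plt

# Made-up abstract data: two quantitative attributes and one categorical attribute
x = [1.0, 2.5, 3.2, 4.8, 5.1, 6.3]
y = [2.1, 1.8, 3.9, 4.2, 2.7, 5.5]
category = ["A", "A", "B", "B", "A", "B"]

# Visual encoding idiom: scatterplot
#   - horizontal/vertical position encode the two quantitative attributes
#   - colour (hue) encodes the categorical attribute
colours = {"A": "tab:blue", "B": "tab:orange"}
plt.scatter(x, y, c=[colours[c] for c in category])
plt.xlabel("attribute 1 (quantitative)")
plt.ylabel("attribute 2 (quantitative)")

# Interaction idiom: pan and zoom, provided by the interactive backend;
# how users change what they see is a design choice of its own.
plt.show()
```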

Slide 8

Word Tree Example

Slide 9

Algorithm
▪ Implementation of the visual encoding and interaction idioms
  ▪ different algorithms can be designed to realise the same idiom
▪ Various factors might impact the choice of a specific algorithm
  ▪ computational complexity (performance)
  ▪ memory usage
  ▪ level of match with the visual encoding idiom
▪ Separate algorithm design (computational issues) from idiom design (human perception issues)
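For example (an illustrative sketch, not from the slides), the same "top-k bar chart" idiom can be realised by two different algorithms with different complexity: a full sort in O(n log n) and O(n) extra memory, or a heap-based selection in O(n log k) and O(k) extra memory. Both produce the identical picture; the choice only affects performance and memory.

```python
import heapq
import random

# Made-up abstract data: (item, value) pairs to be shown as a top-k bar chart
data = [(f"item{i}", random.random()) for i in range(100_000)]
k = 10

# Algorithm 1: sort everything, then slice -- O(n log n) time, O(n) extra memory
def top_k_sort(pairs, k):
    return sorted(pairs, key=lambda p: p[1], reverse=True)[:k]

# Algorithm 2: heap-based selection -- O(n log k) time, O(k) extra memory
def top_k_heap(pairs, k):
    return heapq.nlargest(k, pairs, key=lambda p: p[1])

# Both algorithms realise the same idiom: the resulting bars are identical
assert top_k_sort(data, k) == top_k_heap(data, k)
print(top_k_heap(data, k))
```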

Slide 10

Threats to Validity
▪ Each design level has its own threats to validity
  ▪ wrong problem, wrong abstraction, wrong idiom or wrong algorithm

Slide 11

Validation Approaches

Slide 12

Validation Approaches …
▪ Can perform an immediate or downstream validation
  ▪ downstream dependencies add to the difficulty of validation
    - e.g. poor algorithm design may have a negative effect when validating an interaction technique
  ▪ use of mock-ups for early downstream evaluation
▪ Mismatches
  ▪ mismatch between the level at which the benefit is claimed and the chosen validation methodology
    - e.g. the benefit of a new visual encoding idiom cannot be validated by measuring the performance of the algorithm used downstream
  ▪ carefully select the subset of validation methods matching the levels of design where contributions are claimed

Slide 13

Domain Validation
▪ A field study can help to validate that we are going to address real user needs
  ▪ observe people in real-world settings
  ▪ semi-structured interviews (e.g. contextual inquiry)
▪ Downstream validation can, for example, investigate a solution's adoption rate by the target audience
  ▪ see what target users do (without bringing them into a lab)

Slide 14

Abstraction Validation
▪ The identified task abstraction and data abstraction might not solve the target audience's problems
▪ Downstream validation includes testing the solution with members of the target audience
  ▪ anecdotal (qualitative) feedback on whether the tool is useful
  ▪ field study to observe and document how the target audience uses the tool in their real-world workflow
    - observe changes in behaviour rather than documenting existing work practices

Slide 15

Idiom Validation
▪ Justify the design of the idiom with respect to known perceptual and cognitive principles
▪ Heuristic evaluation or expert reviews may be used to ensure that no known guidelines are violated
▪ Downstream validation
  ▪ controlled experiments in a lab setting (lab study)
    - controlled experiments for testing the performance of specific idioms
    - measure time and errors for given tasks
  ▪ presentation and qualitative discussion of results
    - show images or videos of the solution to the target audience
  ▪ quantitative measurement of the resulting visualisations (quality metrics), such as the number of edge crossings in a node-link graph
  ▪ usability studies
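As a concrete example of such a quality metric (a sketch assuming a simple straight-line node-link layout with made-up node positions, not from the slides), the number of edge crossings can be counted directly from the layout:

```python
from itertools import combinations

def ccw(a, b, c):
    """Return True if the points a, b, c are in counter-clockwise order."""
    return (c[1] - a[1]) * (b[0] - a[0]) > (b[1] - a[1]) * (c[0] - a[0])

def segments_cross(p1, p2, p3, p4):
    """Proper intersection test for segments p1-p2 and p3-p4 (ignores shared endpoints)."""
    return (ccw(p1, p3, p4) != ccw(p2, p3, p4)) and (ccw(p1, p2, p3) != ccw(p1, p2, p4))

def edge_crossings(positions, edges):
    """Count pairwise crossings between edges that do not share a node."""
    count = 0
    for (a, b), (c, d) in combinations(edges, 2):
        if len({a, b, c, d}) < 4:
            continue  # edges sharing a node cannot properly cross
        if segments_cross(positions[a], positions[b], positions[c], positions[d]):
            count += 1
    return count

# Made-up layout: four nodes and two edges that cross exactly once
positions = {"A": (0, 0), "B": (1, 1), "C": (0, 1), "D": (1, 0)}
edges = [("A", "B"), ("C", "D")]
print(edge_crossings(positions, edges))  # -> 1
```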

Slide 16

Algorithm Validation
▪ Analyse the computational complexity of algorithms
  ▪ number of items in the dataset, number of pixels, …
▪ Downstream validation
  ▪ execution time
  ▪ memory consumption
  ▪ scalability
▪ Correctness of the algorithm
  ▪ does the implementation meet the idiom specification?
▪ Standard benchmarks might help to compare algorithms
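A minimal sketch of such a downstream measurement harness (the layout function and dataset sizes are placeholders, not part of the slides): execution time via time.perf_counter and peak memory via tracemalloc, measured over increasing dataset sizes to assess scalability.

```python
import random
import time
import tracemalloc

def layout_algorithm(items):
    """Placeholder for the algorithm under test (here: simply sorting the items)."""
    return sorted(items)

def benchmark(fn, items):
    """Return (execution time in seconds, peak memory in bytes) for one run."""
    tracemalloc.start()
    start = time.perf_counter()
    fn(items)
    elapsed = time.perf_counter() - start
    _, peak = tracemalloc.get_traced_memory()
    tracemalloc.stop()
    return elapsed, peak

# Scalability: repeat the measurement for increasing dataset sizes
for n in (1_000, 10_000, 100_000):
    items = [random.random() for _ in range(n)]
    elapsed, peak = benchmark(layout_algorithm, items)
    print(f"n={n:>7}: {elapsed:.4f} s, peak memory {peak / 1024:.1f} KiB")
```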

Slide 17

MatrixExplorer

Slide 18

MatrixExplorer Validation Methods

Slide 19

Genealogical Graphs

Slide 20

Genealogical Graphs Validation Methods

Slide 21

Flow Maps
Figures: migration from California; top ten states that sent migrants to California (green) and to New York (blue)

Slide 22

Flow Maps Validation Methods

Slide 23

LiveRAC

Slide 24

LiveRAC Validation Methods

Slide 25

Sizing the Horizon Validation Methods

Slide 26

Exercise 4
▪ Analysis and Validation

Slide 27

Further Reading
▪ This lecture is mainly based on the book Visualization Analysis & Design
  ▪ chapter 4 - Analysis: Four Levels for Validation

Slide 28

References
▪ Visualization Analysis & Design, Tamara Munzner, Taylor & Francis, November 2014, ISBN-13: 978-1466508910
▪ N. Henry and J.-D. Fekete, MatrixExplorer: A Dual-Representation System to Explore Social Networks, IEEE Transactions on Visualization and Computer Graphics 12(5), September 2006
  ▪ https://doi.org/10.1109/TVCG.2006.160

Slide 29

References …
▪ M.J. McGuffin and R. Balakrishnan, Interactive Visualization of Genealogical Graphs, Proceedings of InfoVis 2005, Minneapolis, USA, October 2005
  ▪ https://doi.org/10.1109/INFVIS.2005.1532124
  ▪ video: https://www.youtube.com/watch?v=-FkRzDegzAo
▪ D. Phan, L. Xiao, R. Yeh, P. Hanrahan and T. Winograd, Flow Map Layout, Proceedings of InfoVis 2005, Minneapolis, USA, October 2005
  ▪ https://doi.org/10.1109/INFVIS.2005.1532150

Slide 30

References …
▪ P. McLachlan, T. Munzner, E. Koutsofios and S. North, LiveRAC: Interactive Visual Exploration of System Management Time-Series Data, Proceedings of CHI 2008, Florence, Italy, April 2008
  ▪ https://doi.org/10.1145/1357054.1357286
▪ J. Heer, N. Kong and M. Agrawala, Sizing the Horizon: The Effects of Chart Size and Layering on the Graphical Perception of Time Series Visualizations, Proceedings of CHI 2009, Boston, USA, April 2009
  ▪ https://doi.org/10.1145/1518701.1518897

Slide 31

Next Lecture: Data Presentation