Analysis and Validation - Lecture 4 - Information Visualisation (4019538FNR)

This lecture forms part of the course Information Visualisation given at the Vrije Universiteit Brussel.

Beat Signer

March 09, 2023

Transcript

  1. Information Visualisation
    Analysis and Validation
    Prof. Beat Signer
    Department of Computer Science
    Vrije Universiteit Brussel
    beatsigner.com

  2. Four Nested Levels of Vis Design

  3. Validation
    ▪ Huge vis design space and most designs are ineffective
    ▪ Validate choices right from the beginning of the design
    process
    ▪ top-down design (problem-driven work)
    - start at the top domain situation level
    - major challenge at the data/task abstraction level; mainly use existing idioms
    ▪ bottom-up design (technique-driven work)
    - invention of new idioms or algorithms
    ▪ Independently validate all four levels of the design
    ▪ domain validation
    ▪ abstraction validation (what and why)
    ▪ idiom validation (how)
    ▪ algorithm validation

  4. Four Nested Levels of Vis Design
    ▪ Output from upstream level is input to downstream level
    ▪ errors at upstream levels propagate to downstream levels
    ▪ highly iterative design process

  5. Domain Situation
    ▪ A domain situation is defined by
    ▪ target users
    ▪ domain of interest of target users
    - each domain might have its own vocabulary
    ▪ data of target users
    ▪ questions (tasks) of target users
    ▪ Outcome of design process
    ▪ understanding of user needs (user-centred design)
    - e.g. via observations or interviews
    ▪ Challenges and risks
    ▪ users often cannot clearly specify their analysis needs
    ▪ designers make assumptions (rather than engaging with users)

  6. Data & Task Abstraction
    ▪ Abstract from the domain-specific questions at the
    upstream level to a generic representation
    ▪ questions from different domain situations can map to same
    abstract vis tasks
    - e.g. browsing, comparing or summarising
    ▪ Design abstract data
    ▪ data from upstream is often transformed to something different
    ▪ determine which data type supports a visual representation that
    addresses a user's problem (see the sketch below)
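
    A minimal sketch of this abstraction step in Python (the hospital admission records, field names and types are purely illustrative assumptions, not taken from the lecture): domain-specific records are mapped onto a generic table of items with categorical and quantitative attributes, and the domain question becomes an abstract task that suggests an idiom.

    # Hypothetical domain data: raw hospital admission records (illustrative only).
    raw_records = [
        {"patient": "A-17", "ward": "cardiology", "days_stayed": 4, "age": 71},
        {"patient": "B-02", "ward": "oncology", "days_stayed": 9, "age": 58},
        {"patient": "C-41", "ward": "cardiology", "days_stayed": 2, "age": 64},
    ]

    # Data abstraction: a generic table of items with typed attributes; the domain
    # vocabulary disappears, only "categorical" vs "quantitative" remains.
    abstract_table = {
        "items": [r["patient"] for r in raw_records],
        "attributes": {
            "ward": {"type": "categorical", "values": [r["ward"] for r in raw_records]},
            "days_stayed": {"type": "quantitative", "values": [r["days_stayed"] for r in raw_records]},
            "age": {"type": "quantitative", "values": [r["age"] for r in raw_records]},
        },
    }

    # Task abstraction: the domain question "which ward keeps patients longest?"
    # becomes "compare a quantitative attribute across a categorical attribute",
    # which points towards e.g. a bar chart idiom.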

  7. Visual Encoding & Interaction Idiom
    ▪ Specific way (idiom) to create and manipulate
    the visual representation of abstract data
    ▪ visual encoding idiom
    - create a "picture" out of the data (what do users see?)
    ▪ interaction idiom
    - how do users change what they see?
    ▪ Design space of the combination of visual encoding and
    interaction idioms is very large
    ▪ data and task abstractions help to reduce the number of potential
    visual encoding and interaction idioms
    ▪ decision about good or bad matches based on human abilities
    (visual perception and memory)

  8. Word Tree Example

  9. Algorithm
    ▪ Implementation of visual encoding and interaction idioms
    ▪ can design different algorithms to realise the same idiom (see the sketch below)
    ▪ Various factors might impact the choice of a specific
    algorithm
    ▪ computational complexity (performance)
    ▪ memory usage
    ▪ level of match with visual encoding idiom
    ▪ Separate algorithm design (computational issues) from
    idiom design (human perception issues)
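
    A small Python sketch of this point (the bar chart data and function names are illustrative assumptions, not from the slides): both routines below produce the identical value-sorted bar ordering, so the idiom stays fixed while the algorithmic choice differs only in computational complexity.

    # Idiom: a bar chart whose bars are ordered by decreasing value.
    # Two algorithms realising the same idiom with different complexity.

    def order_bars_sorted(values):
        """O(n log n): rely on the built-in sort."""
        return sorted(values, key=lambda pair: pair[1], reverse=True)

    def order_bars_selection(values):
        """O(n^2): repeatedly pick the largest remaining value."""
        remaining, ordered = list(values), []
        while remaining:
            largest = max(remaining, key=lambda pair: pair[1])
            remaining.remove(largest)
            ordered.append(largest)
        return ordered

    data = [("Brussels", 12), ("Ghent", 7), ("Antwerp", 19)]  # illustrative values
    assert order_bars_sorted(data) == order_bars_selection(data)  # same picture either way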

  10. Threats to Validity
    ▪ Each design level has its own threats to validity
    ▪ wrong problem, wrong abstraction, wrong idiom or wrong algorithm

  11. Validation Approaches

  12. Validation Approaches …
    ▪ Can perform an immediate or downstream validation
    ▪ downstream dependencies add to the difficulty of validation
    - e.g. poor algorithm design may have a negative effect when validating
    an interaction technique
    ▪ use of mock-ups for early downstream evaluation
    ▪ Mismatches
    ▪ mismatch between the level at which the benefit is claimed and
    the chosen validation methodology
    - e.g. benefit of new visual encoding idiom cannot be validated by measuring the
    performance of the algorithm used downstream
    ▪ carefully select the subset of validation methods matching
    the levels of design where contributions are claimed

  13. Domain Validation
    ▪ A field study can help to validate that we are going
    to address real user needs
    ▪ observe people in real-world settings
    ▪ semi-structured interviews (e.g. contextual inquiry)
    ▪ Downstream validation can for example investigate a
    solution's adoption rate by the target audience
    ▪ see what target users do (without bringing them into a lab)

  14. Abstraction Validation
    ▪ Identified task abstraction and data abstraction might
    not solve the target audience's problems
    ▪ Downstream validation includes testing the solution with
    members of the target audience
    ▪ anecdotal (qualitative) feedback on whether the tool is useful
    ▪ field study to observe and document how the target audience
    uses the tool in their real-world workflow
    - observe changes in behaviour rather than documenting existing work practices

  15. Idiom Validation
    ▪ Justify the design of the idiom with respect to known
    perceptual and cognitive principles
    ▪ Heuristic evaluation or expert reviews may be used to
    ensure that no known guidelines are violated
    ▪ Downstream validation
    ▪ controlled experiments in a lab setting (lab study)
    - controlled experiments for testing the performance of specific idioms
    - measure time and errors for given tasks
    ▪ presentation and qualitative discussion of results
    - show images or videos of the solution to the target audience
    ▪ quantitative measurement of resulting visualisations (quality metrics),
    such as the number of edge crossings in a node-link graph (see the sketch after this list)
    ▪ usability studies
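
    A minimal Python sketch of such a quality metric, counting pairwise edge crossings in a given node-link layout (the layout data and helper names are illustrative assumptions, not from the lecture):

    # Count edge crossings in a 2D node-link layout.
    # nodes: {id: (x, y)}, edges: list of (id, id) pairs.

    def _ccw(a, b, c):
        return (b[0] - a[0]) * (c[1] - a[1]) - (b[1] - a[1]) * (c[0] - a[0])

    def _segments_cross(p1, p2, p3, p4):
        """True if the segments p1-p2 and p3-p4 properly intersect."""
        d1, d2 = _ccw(p3, p4, p1), _ccw(p3, p4, p2)
        d3, d4 = _ccw(p1, p2, p3), _ccw(p1, p2, p4)
        return (d1 * d2 < 0) and (d3 * d4 < 0)

    def edge_crossings(nodes, edges):
        crossings = 0
        for i in range(len(edges)):
            for j in range(i + 1, len(edges)):
                a, b = edges[i]
                c, d = edges[j]
                if len({a, b, c, d}) < 4:  # skip edges sharing an endpoint
                    continue
                if _segments_cross(nodes[a], nodes[b], nodes[c], nodes[d]):
                    crossings += 1
        return crossings

    # Illustrative layout: a square drawn with both diagonals -> one crossing.
    layout = {"n1": (0, 0), "n2": (1, 0), "n3": (1, 1), "n4": (0, 1)}
    print(edge_crossings(layout, [("n1", "n3"), ("n2", "n4")]))  # 1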

  16. Algorithm Validation
    ▪ Analyse computational complexity of algorithms
    ▪ number of items in the dataset, number of pixels, …
    ▪ Downstream validation
    ▪ execution time
    ▪ memory consumption
    ▪ scalability
    ▪ Correctness of algorithm
    ▪ does the implementation meet the idiom specification?
    ▪ Standard benchmarks might help to compare algorithms (see the timing sketch below)
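
    A minimal sketch of such downstream measurements in Python, timing an algorithm and recording peak memory for growing dataset sizes with the standard library (the layout_algorithm placeholder is an assumption, not code from the lecture):

    import time
    import tracemalloc

    def layout_algorithm(items):
        """Placeholder for the algorithm under test (illustrative assumption)."""
        return sorted((hash(i) % 1000, i) for i in items)

    def benchmark(algorithm, sizes):
        """Report execution time and peak memory for growing inputs (scalability)."""
        for n in sizes:
            dataset = list(range(n))
            tracemalloc.start()
            start = time.perf_counter()
            algorithm(dataset)
            elapsed = time.perf_counter() - start
            _, peak = tracemalloc.get_traced_memory()
            tracemalloc.stop()
            print(f"n={n:>8}  time={elapsed:.4f}s  peak memory={peak / 1024:.1f} KiB")

    benchmark(layout_algorithm, sizes=[1_000, 10_000, 100_000])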

  17. MatrixExplorer

  18. MatrixExplorer Validation Methods

  19. Genealogical Graphs

  20. Genealogical Graphs Validation Methods

  21. Flow Maps
    Flow maps showing migration from California, and the top ten states that
    sent migrants to California (green) and to New York (blue)

  22. Flow Maps Validation Methods

  23. LiveRAC

  24. LiveRAC Validation Methods

  25. Sizing the Horizon Validation Methods

  26. Exercise 4
    ▪ Analysis and Validation

  27. Further Reading
    ▪ This lecture is mainly based on the
    book Visualization Analysis & Design
    ▪ chapter 4
    - Analysis: Four Levels for Validation

  28. References
    ▪ Visualization Analysis & Design, Tamara
    Munzner, Taylor & Francis, November 2014,
    ISBN-13: 978-1466508910
    ▪ N. Henry and J.-D. Fekete, MatrixExplorer: A Dual-
    Representation System to Explore Social Networks,
    IEEE Transactions on Visualization and Computer
    Graphics 12(5), September 2006
    ▪ https://doi.org/10.1109/TVCG.2006.160

  29. References …
    ▪ M.J. McGuffin and R. Balakrishnan, Interactive
    Visualization of Genealogical Graphs, Proceedings of
    InfoVis 2005, Minneapolis, USA, October 2005
    ▪ https://doi.org/10.1109/INFVIS.2005.1532124
    ▪ video: https://www.youtube.com/watch?v=-FkRzDegzAo
    ▪ D. Phan, L. Xiao, R. Yeh, P. Hanrahan and T. Winograd,
    Flow Map Layout, Proceedings of InfoVis 2005,
    Minneapolis, USA, October 2005
    ▪ https://doi.org/10.1109/INFVIS.2005.1532150

  30. References …
    ▪ P. McLachlan, T. Munzner, E. Koutsofios and
    S. North, LiveRAC: Interactive Visual Exploration of
    System Management Time-Series Data, Proceedings of
    CHI 2008, Florence, Italy, April 2008
    ▪ https://doi.org/10.1145/1357054.1357286
    ▪ J. Heer, N. Kong and M. Agrawala, Sizing the Horizon:
    The Effects of Chart Size and Layering on the Graphical
    Perception of Time Series Visualizations, Proceedings of
    CHI 2009, Boston, USA, April 2009
    ▪ https://doi.org/10.1145/1518701.1518897

  31. Next Lecture
    Data Presentation
