#hcid2012 - Measuring Perceptible Affordances with Eyetracking - Jacques Chueke

City Interaction Lab

April 17, 2012

Transcript

  1. Measuring Perceptible Affordances with Eye Tracking: An iGoogle Case Study
     Jacques Chueke, PhD Researcher at the Centre for HCI Design, School of Informatics, City University London. Master in Design, PUC-Rio, RJ, Brazil.
     George Buchanan (1st Supervisor), Lecturer, Centre for HCI Design. Stephanie Wilson (2nd Supervisor), Lecturer, Centre for HCI Design.
     London, UK, May 2011
  2. New Modes of Interaction
     CES 2009: Hitachi's gesture remote control TV prototype. Front-facing webcam to track head movements for cursor control, 2011. Microsoft Surface, 2007. Xbox 360 (Kinect) Dashboard, 2011. Tobii Lenovo, June 2011.
  3. Problem Statement
     • Traditional modes of interaction (e.g. mouse and keyboard) are being replaced by NUIs (physical interactions, e.g. touch, voice, gestures, eye gaze).
     • Emerging Post-WIMP command vocabularies.
     • Hybrid solutions pairing control modes (interactions) with visual solutions (interfaces).
     • The 'hidden gestures' issue.
     • The 'invisible controls' issue, or 'hidden interactions'.
  4. Problem Statement
     • What am I supposed to do? And if I did, what's going to happen?
     Carroll, Lewis (2009). Alice's Adventures in Wonderland and Through the Looking-Glass.
  5. The Hidden Interactions issue
     • The new Windows 8, with features similar to those used in Windows Phone and the Xbox 360 Dashboard.
  6. The Hidden Interactions issue
     • The iPad's hidden interactions/gestures issue, and some interface solutions.
  7. Methodology: iGoogle Empirical Study
     1. OBJECT AND PARTICIPANTS
     1.1. iGoogle personal web portal (identified as suffering from poor perceptible affordances for its drag-and-drop interaction).
     1.2. Ten (10) participants: three (3) of beginner, four (4) of intermediate and three (3) of advanced expertise with the Internet and computers. No previous knowledge of the portal.
     2. OBSERVATION PROTOCOL (Eye Tracking)
     2.1. 10-second 'Observation Phase' (no comments from the facilitator, no verbalization from the subject).
     2.2. 15-minute 'Observation with Think Aloud' (TA) method (no interaction of any kind). Questions: 'Q1: What is this website for?', 'Q2: What can you do in this kind of website?', 'Q3: Do you think it is possible to change your screen the way you like it?', 'Q4: Is it possible to move anything in there?'
     3. DATA ANALYSIS
     3.1. Deductive approach for QUANTITATIVE data analysis (from eye tracking).
     3.2. Inductive approach for QUALITATIVE data analysis (from verbalizations).
     3.3. Conclusions and theory (generalization).
     (A sketch of this protocol as data follows below.)
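     As a rough illustration only (not part of the original deck), the protocol above can be written down as plain data for a study-runner script. The durations, questions and participant counts come from the slide; every name and the structure itself are hypothetical assumptions:

        # Hypothetical sketch: the slide-7 observation protocol encoded as data.
        # Durations, questions and participant counts are from the slide; the
        # names and structure are assumptions for illustration only.
        PROTOCOL = {
            "stimulus": "iGoogle personal web portal (no prior exposure)",
            "phases": [
                {"name": "Observation Phase", "duration_s": 10,
                 "rules": ["no facilitator comments", "no subject verbalization"]},
                {"name": "Observation with Think Aloud (TA)", "duration_s": 15 * 60,
                 "rules": ["no interaction of any kind"],
                 "questions": [
                     "Q1: What is this website for?",
                     "Q2: What can you do in this kind of website?",
                     "Q3: Do you think it is possible to change your screen the way you like it?",
                     "Q4: Is it possible to move anything in there?",
                 ]},
            ],
            "participants": {"beginner": 3, "intermediate": 4, "advanced": 3},
        }

        for phase in PROTOCOL["phases"]:
            print(f"{phase['name']}: {phase['duration_s']} s")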
  8. Methodology
     • Gazeplot comparison: Beginner x Advanced. [Figure: gazeplots of Participant, 54, beginner expertise, vs. Participant, 22, advanced expertise.]
  9. Quantitative Data Analysis: Categories
     Areas of Interest (AOIs): global elements (Logo, Search, Menu_V, Menu_V_Options, Menu_H) plus, for each widget (SI, D&T, Weather, CNET, Simplify, Sports, YouTube, Epicurious), a Bar, Link, Options and Content AOI, with an additional Sign In AOI for YouTube and Epicurious. (A sketch of one possible AOI-to-category mapping follows below.)
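     As a hypothetical sketch (the deck never spells the mapping out), the AOIs above can be grouped into the four analysis categories used on the following slides. The specific assignments below are illustrative assumptions, not the study's actual coding scheme:

        # Hypothetical sketch: mapping AOI names to the four analysis categories
        # (C1-C4) seen on slides 10-15. The assignments are my assumptions.
        WIDGETS = ["SI", "D&T", "Weather", "CNET", "Simplify",
                   "Sports", "YouTube", "Epicurious"]

        aoi_category = {name: "C1: SITE CONTROL"
                        for name in ["Logo", "Search", "Menu_V",
                                     "Menu_V_Options", "Menu_H"]}
        for w in WIDGETS:
            aoi_category[f"{w}_Bar"] = "C4: WIDGET MOVE"      # assumed: bar = drag handle
            aoi_category[f"{w}_Link"] = "C2: WIDGET CONTROL"
            aoi_category[f"{w}_Options"] = "C2: WIDGET CONTROL"
            aoi_category[f"{w}_Content"] = "C3: WIDGET CONTENT"
        aoi_category["YouTube_Sign In"] = "C2: WIDGET CONTROL"     # assumed
        aoi_category["Epicurious_Sign In"] = "C2: WIDGET CONTROL"  # assumed

        def category_totals(per_aoi_metric):
            """Sum a per-AOI metric (e.g. fixation counts) into category totals."""
            totals = {}
            for aoi, value in per_aoi_metric.items():
                cat = aoi_category.get(aoi, "UNCLASSIFIED")
                totals[cat] = totals.get(cat, 0) + value
            return totals

        print(category_totals({"Weather_Content": 12, "Weather_Bar": 3}))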
  10.–15. Quantitative Data Analysis: Categories
     [Figure, repeated with variations across slides 10–15: the iGoogle home page annotated with the four AOI categories: C1: Site Control, C2: Widget Control, C3: Widget Content, C4: Widget Move.]
  16. Quantitative Data Analysis: Heatmap
     • 'Q3: Do you think it is possible to change your screen the way you like it?' [Heatmap figure.]
  17. Quantitative Data Analysis: Heatmap
     • 'Q4: Is it possible to move anything in there?' [Heatmap figure.] (A sketch of how such a heatmap can be computed follows below.)
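     An aside not in the deck: fixation heatmaps like those on slides 16–17 are normally produced by the eye tracker's own software, but the underlying idea is to stamp a duration-weighted Gaussian at each fixation. A minimal sketch, where all names and parameters are assumptions:

        # Minimal sketch of a duration-weighted gaze heatmap (assumed approach,
        # not the study's actual tooling). fixations: (x, y, duration_s) tuples.
        import numpy as np

        def gaze_heatmap(fixations, width, height, sigma=40):
            ys, xs = np.mgrid[0:height, 0:width]
            heat = np.zeros((height, width))
            for x, y, dur in fixations:
                heat += dur * np.exp(-((xs - x) ** 2 + (ys - y) ** 2)
                                     / (2 * sigma ** 2))
            return heat / heat.max()  # normalize to [0, 1] for colour mapping

        # Example: three fixations on a 1024x768 stimulus.
        hm = gaze_heatmap([(200, 150, 0.4), (510, 300, 1.2), (520, 310, 0.9)],
                          1024, 768)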
  18. Quantitative Data Analysis: Real Value x Estimated Value
     [Graph: First Fixation Duration (seconds), queries x AOI categories; data labels: x1.73, x0.5, x4.03, x0.35, x1.41, x0.44, x3.2, x0.29.]
  19. Quantitative Data Analysis: Real Value x Estimated Value
     [Graph: Total Fixation Duration (seconds), queries x AOI categories; data labels: x3.6, x1.14, x8.4, x0.74, x4.34, x1.36, x10.0, x0.89.]
  20. Quantitative Data Analysis: Real Value x Estimated Value
     [Graph: Fixation Count (saccades), queries x AOI categories; data labels: x14.0, x4.46, x32.8, x2.9, x12, x3.9, x28.8, x2.5.] (A sketch of the real-vs-estimated comparison follows below.)
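     A hedged note on how the 'Estimated Value' in these three graphs is commonly derived (an assumption; the deck does not define it, though slide 21's 'projected numbers based on the area size' points the same way): attention is presumed to distribute in proportion to AOI area, so an area covering p% of the screen is expected to receive p% of a metric's total, and the real-to-estimated ratio flags areas drawing disproportionate attention. The function names and numbers below are illustrative only:

        # Hypothetical sketch of the "Real Value x Estimated Value" comparison.
        # Assumption: estimated value = total metric * category's share of area.

        def estimated_value(total_metric, category_area, total_area):
            """Expected fixation metric if attention were proportional to area."""
            return total_metric * (category_area / total_area)

        def real_vs_estimated(real, total_metric, category_area, total_area):
            """Ratio > 1: the category drew more attention than its size predicts."""
            return real / estimated_value(total_metric, category_area, total_area)

        # Illustrative numbers only (not the study's data): a 'Widget Move'
        # category covering 2% of the screen that received 1.2 s of a 15 s total.
        ratio = real_vs_estimated(real=1.2, total_metric=15.0,
                                  category_area=0.02, total_area=1.0)
        print(f"x{ratio:.2f}")  # -> x4.00: ~4x more than area alone predicts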
  21. Empirical Study Conclusions
     • Beginner-expertise participants were unable to describe the difference between regular portals and personal web portals.
     • No novice participant could spot the drag-and-drop interactions.
     • We could infer whether the questions had an effect on participants' gaze over the displayed home page: yes for the very first question, and not particularly for the following ones.
     • Different questions about control led participants to look at similar things, which suggests they mostly scan over familiar items they know to have that property. There were quite a few accidental landings on content, which is expected, since it is hard to avoid looking at pictures.
     • Area 2 (Widget Control) and Area 4 (Widget Move) were observed on a larger scale than the projected numbers based on area size would predict.
     • Where someone looks doesn't tell you what he or she is thinking. Only qualitative data will start unveiling what people are thinking about while they're looking.
  22. Conclusions and Future Work
     • A prototype with Post-WIMP characteristics and a NUI mode of interaction will be built in order to understand how users visually scan such interfaces to obtain the gist of their interactive potential.
     • By developing a methodology for an empirical study that focuses on observation prior to any interaction, we aim to identify which elements people focus on in NUI screens.
     • Quantitative (eye tracking) and qualitative (verbalizations) data will be combined to produce conclusions about what kind of information can be obtained with the protocol, and how this data can be adapted to inform better interaction design for NUI systems.
     • Variables that affect how one interprets an interface:
       • FAMILIARITY with the technology.
       • VISIBILITY of controls: the perceptible affordances issue. Beyond sight, beyond mind...
       • People TEST the environment to get a response: the SCAFFOLDING and WITHDRAWING concept.
  23. Bibliography
     Beaudouin-Lafon, M. (November 2000). "Instrumental Interaction: An Interaction Model for Designing Post-WIMP User Interfaces". CHI '00: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. The Hague, The Netherlands: ACM Press. pp. 446–453. doi:10.1145/332040.332473. ISBN 1-58113-216-6. http://www.daimi.au.dk/CPnets/CPN2000/download/chi2000.pdf
     Breeze, James. "Eye Tracking: Best Way to Test Rich App Usability". UX Magazine, accessed 25 November 2010. http://www.uxmag.com/technology/eye-tracking-the-best-way-to-test-rich-app-usability
     Buxton, W. (2001). "Less is More (More or Less)", in P. Denning (Ed.), The Invisible Future: The Seamless Integration of Technology in Everyday Life. New York: McGraw Hill, 145–179.
     ITU Internet Reports 2005: The Internet of Things – Executive Summary.
     van Dam, A. (February 1997). "Post-WIMP User Interfaces". Communications of the ACM (ACM Press) 40 (2): pp. 63–67. doi:10.1145/253671.253708.
     Dourish, P. Where the Action Is: The Foundations of Embodied Interaction. A Bradford Book: The MIT Press, USA, 2004.
     Ehmke & Wilson (2007). "Identifying Web Usability Problems from Eye-Tracking Data". Published by the British Computer Society. People and Computers XXI – HCI… but not the way we know it: Proceedings of HCI 2007.
     Gaver, W. (1991). "Technology Affordances". Proceedings of CHI '91. ACM 0-89791-383-3/91/0004/0079.
     Gentner, D. and Nielsen, J. (August 1996). "The Anti-Mac Interface". Communications of the ACM (ACM Press) 39 (8): pp. 70–82. http://www.useit.com/papers/anti-mac.html
     Jacob, R. et al. (2008). "Reality-Based Interaction: A Framework for Post-WIMP Interfaces". CHI '08: Proceedings of the Twenty-Sixth Annual SIGCHI Conference on Human Factors in Computing Systems. Florence, Italy: ACM. pp. 201–210. doi:10.1145/1357054.1357089. ISBN 978-1-60558-011-1.
  24. Bibliography
     McGrenere, J. & Ho, W. (2000). "Affordances: Clarifying and Evolving a Concept". Proceedings of Graphics Interface 2000, Montreal, May 2000.
     McNaughton, J. "Utilizing Emerging Multi-touch Table Designs". Technology Enhanced Learning Research Group, Durham University. TR-TEL-10-01.
     Nielsen, J. (April 1993). "Noncommand User Interfaces". Communications of the ACM (ACM Press) 36 (4): pp. 83–99. doi:10.1145/255950.153582. http://www.useit.com/papers/noncommand.html
     Norman, D. (1999). "Affordance, Conventions and Design". ACM Interactions (May + June 1999), 38–42.
     Picard, R. Affective Computing. The MIT Press, Cambridge, Massachusetts; London, England, 1998.
     Preece, J., Sharp, H. & Rogers, Y. Interaction Design: Beyond Human-Computer Interaction (2nd edition). John Wiley & Sons, Ltd. West Sussex, UK, 2009.
     Ramduny-Ellis, D., Dix, A., Hare, J. & Gill, S. "Physicality: Towards a Less-GUI Interface" (Preface). Proceedings of the Third International Workshop on Physicality. Cambridge, England, 2009.
     St. Amant, R. (1999). "User Interface Affordances in a Planning Representation". Human-Computer Interaction 14, 3 (September 1999), 317–354.
     Sorensen, M. "Making a Case for Biological and Tangible Interfaces". Proceedings of the Third International Workshop on Physicality. Cambridge, England, 2009.
     Sternberg, R. Cognitive Psychology. Wadsworth, Cengage Learning. Belmont, CA, USA, 2009.
     Vyas, D., Chisalita, C. & van der Veer, G. (2006). "Affordance in Interaction". ECCE '06: Proceedings of the 13th European Conference on Cognitive Ergonomics: Trust and Control in Complex Socio-Technical Systems. ACM, New York, NY, USA. ISBN 978-3-906509-23-5.
     Wigdor, D. & Wixon, D. Brave NUI World: Designing Natural User Interfaces for Touch and Gesture. Morgan Kaufmann Publishers, USA, 2011.