
Predicting efficiency without user testing.

What if I told you there was a way to calculate task times without users?

Miguel Ángel Pérez

October 05, 2015

Transcript

  1. Efficiency === Time on task. Imagine time-clocking software used by a large company:
     1. Task time × frequency × employee’s pay
     2. Shaving off a few seconds on a task could mean saving hours per week
     3. Concrete ROI on better usability
     4. ????
     5. PROFIT!!!!
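The ROI arithmetic above can be made concrete. A minimal sketch with assumed figures (a 5-second saving on a task performed 40 times a day by 200 employees earning $30/hour) — none of these numbers come from the deck:

```python
# Assumed figures, for illustration only
seconds_saved_per_task = 5
tasks_per_day = 40
employees = 200
hourly_pay = 30.0  # dollars

# Task time saved × frequency × headcount, converted to hours
hours_saved_per_day = seconds_saved_per_task * tasks_per_day * employees / 3600
daily_savings = hours_saved_per_day * hourly_pay

print(round(hours_saved_per_day, 1))  # 11.1 hours/day
print(round(daily_savings, 2))        # 333.33 dollars/day
```

Even a tiny per-task saving compounds quickly at organizational scale, which is the point of the slide.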
  2. We don’t always have access to users. Most of the time we are relying on our intuition, or it’s too late to make major changes.
  3. What if I told you there was a way to calculate task times without users?
  4. A brief history...
     1. We started getting interested in measuring how long repetitive actions take back in the Industrial Revolution
     2. Fitts’ Law allows us to predict the time required to rapidly move to a target area based on distance and size
     3. Xerox PARC used principles of Fitts’ Law while toying with ideas for input peripherals like the mouse
     4. The research at PARC led to the 1983 book The Psychology of Human-Computer Interaction, which introduced a way to predict task times
     5. The Keystroke-Level Model was developed as an easy way to mark up and calculate task flows
     6. More recently, researchers extended the Keystroke-Level Model to include operators for touch devices
  5. Keystroke-Level Model. Using KLM you can predict a user’s task time to within 10–20% of the actual time. It would take testing 80 users to achieve the same margin of error.
     • Homing (moving hand to keyboard or mouse): 360 ms
     • Clicking the mouse: 230 ms
     • Pointing with the mouse: 1100 ms
     • Mental operation (deciding what to do): 1350 ms
     https://en.wikipedia.org/wiki/Keystroke-level_model
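KLM prediction is just a sum of operator times over an encoded task. A minimal sketch using the desktop operator times from the slide above; the example task (point at a button, then click it) is an assumed illustration, not one from the deck:

```python
# Keystroke-Level Model operator times in milliseconds (desktop values above)
KLM_MS = {
    "H": 360,   # Homing: moving hand to keyboard or mouse
    "K": 230,   # Clicking the mouse
    "P": 1100,  # Pointing with the mouse
    "M": 1350,  # Mental operation: deciding what to do
}

def predict_ms(sequence: str) -> int:
    """Predict task time by summing operator times for an encoded task."""
    return sum(KLM_MS[op] for op in sequence)

# Assumed example task: point at a button (P), then click it (K)
print(predict_ms("PK"))  # 1100 + 230 = 1330 ms
```

Adding a mental operator (`"MPK"`) models the user pausing to decide before acting, which is how KLM accounts for thinking time.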
  6. Evolving KLM for touch devices. Recent research proposes adding a few new operators to take touch devices into account.
     • Drag an item: 2047 ms
     • Homing (switching keyboard mode): 1928 ms
     • Keystroke on a virtual keyboard: 13 ms
     • Rotate the device: 474 ms
     • Swipe on the screen: 384 ms
     • Tap an element on the screen: 110 ms
     • Zoom (pinch gesture): 527 ms
     http://dl.acm.org/citation.cfm?id=2638532
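The touch operators work the same way: tabulate the times and sum over a task. A minimal sketch; the example task (tap a field, then type three characters on the virtual keyboard) is an assumption for illustration:

```python
# Touch-level operator times in milliseconds (values from the slide above)
TLM_MS = {
    "drag": 2047,             # Drag an item
    "keyboard_switch": 1928,  # Homing: switching keyboard mode
    "keystroke": 13,          # Keystroke on a virtual keyboard
    "rotate": 474,            # Rotate the device
    "swipe": 384,             # Swipe on the screen
    "tap": 110,               # Tap an element on the screen
    "zoom": 527,              # Zoom: pinch gesture
}

def predict_touch_ms(ops: list) -> int:
    """Sum touch-operator times for an encoded task sequence."""
    return sum(TLM_MS[op] for op in ops)

# Assumed example: tap a text field, then type 3 characters
print(predict_touch_ms(["tap", "keystroke", "keystroke", "keystroke"]))  # 110 + 3*13 = 149 ms
```

Note how cheap individual virtual keystrokes are compared to switching keyboard modes, which dominates text-entry estimates on touch devices.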
  7. Keystroke-Level Model • Homing: 360 ms • Clicking: 230 ms • Pointing: 1100 ms • Mental operation: 1350 ms
     Touch-Level Model • Drag: 2047 ms • Virtual keyboard: 1928 ms • Rotate: 474 ms • Swipe: 384 ms • Tap: 110 ms • Zoom: 527 ms
     Predicted time: 2660 ms
     http://www.measuringu.com/predicted-times.php
  8. Keystroke-Level Model • Homing: 360 ms • Clicking: 230 ms • Pointing: 1100 ms • Mental operation: 1350 ms
     Touch-Level Model • Drag: 2047 ms • Virtual keyboard: 1928 ms • Rotate: 474 ms • Swipe: 384 ms • Tap: 110 ms • Zoom: 527 ms
     Predicted time: 1330 ms
     http://www.measuringu.com/predicted-times.php
  9. Keystroke-Level Model • Homing: 360 ms • Clicking: 230 ms • Pointing: 1100 ms • Mental operation: 1350 ms
     Touch-Level Model • Drag: 2047 ms • Virtual keyboard: 1928 ms • Rotate: 474 ms • Swipe: 384 ms • Tap: 110 ms • Zoom: 527 ms
     https://www.travelocity.com