Put Your Money Where the Mouse Is: Tools and Techniques for Making Informed Design Decisions

Last spring we spoke at the Library Technology Conference about the process we used to redesign our library’s website, and in particular, how we focused on our homepage, our most-used page by far. Having launched our site, we were eager to learn how well we were meeting the needs of our users, and set out to gather detailed data about user interactions with our homepage. We developed a custom JavaScript library to capture user interaction data on our homepage, anonymously recording in Google Analytics each link that is clicked and every search query performed. Analysis of the click events has clearly shown us which features of our page engage our users and which may just be distractions. Meanwhile, the collection of search query terms allows us to examine the most common discovery trends and evaluate the relevance of search results from our catalog. Our homepage serves many purposes and many constituencies, and every design decision is an exercise in balancing user needs and organizational priorities. Our homepage usage data give us a realistic measure of our users’ revealed priorities, and force necessary, if sometimes uncomfortable, conversations about how we balance user productivity and efficiency against our institutional desire to promote services, news and events, and less-used parts of our collections. In this session, we’ll describe in detail the techniques we used to gather and analyze these data, as well as other methods to achieve similar results with less custom development. We’ll also discuss the changes we’ve made to our site as a result, and how we’re using this information in conversations about site and service priorities.

Eric Larson

March 19, 2015

Transcript

  1. Put Your Money Where the Mouse Is. Library Technology Conference / 2015 Cody Hanson and Eric Larson University of Minnesota-Twin Cities Libraries

  2. Redesign GOALS

  3. GOALS Responsive Design Usable with any device. Contemporary Web Standards Pay off technical debt. Accessibility Usable by anyone. SPEEED Because fast.

  4. How’d we do? GOALS Responsive Design Usable with any device. Contemporary Web Standards Pay off technical debt. Accessibility Usable by anyone. SPEEED Because fast.

  5. GOALS Responsive Design Usable with any device. Contemporary Web Standards Pay off technical debt. Accessibility Usable by anyone. SPEEED Because fast.

  6. GOALS Responsive Design Usable with any device. Contemporary Web Standards Pay off technical debt. Accessibility Usable by anyone. SPEEED Because fast.

  7. GOALS Responsive Design Usable with any device. Contemporary Web Standards Pay off technical debt. Accessibility Usable by anyone. SPEEED Because fast.

  8. GOALS Responsive Design Usable with any device. Contemporary Web Standards Pay off technical debt. Accessibility Usable by anyone. SPEEED Because fast.

  9. What’s missing? GOALS Responsive Design Usable with any device. Contemporary Web Standards Pay off technical debt. Accessibility Usable by anyone. SPEEED Because fast.

  10. Goals / What’s missing? This stuff is responsive. It’s accessible (mostly). It’s screaming fast. But is it the right stuff?

  11. Goals / What’s missing? “If your UX asks the user to make choices... even if those choices are both clear and useful, the act of deciding is a cognitive drain.” -Kathy Sierra, “Your app makes me fat”

  12. Measuring / Without being creepy

  13. Measuring

  14. Measuring
      We use Google Analytics
      We use the “anonymizeIp” flag
      We do not track across sites
      We use a 30 min. session length
      We use a 14 day campaign time-out

  15. Measuring For all data collection described in this presentation, users are identified only by an IP address stripped of its last octet (IPv4) or 80 bits (IPv6). We’ve made our peace with this.

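The measuring setup described on the two slides above maps onto a few lines of analytics.js. A minimal sketch, assuming the standard analytics.js loader snippet is already on the page and using a placeholder property ID; the 30-minute session length and 14-day campaign time-out are configured in the Google Analytics admin interface rather than in page code:

    // Placeholder property ID; substitute your own.
    ga('create', 'UA-XXXXXXXX-1', 'auto');
    // Ask GA to anonymize the sender: the last octet (IPv4) or last 80 bits (IPv6)
    // of the IP address are dropped before storage.
    ga('set', 'anonymizeIp', true);
    ga('send', 'pageview');
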
  16. Web Analytics

  17. Users / They really exist MailChimp’s Personas

  18. Google Analytics Track Pageviews What they view. OS/Browser Stats What devices they use. Traffic Sources Where they came from. SEO / Keywords What they searched to arrive here.

  19. Track / Custom Events What they do here.

  20. Track / Custom Events “Events are user interactions with content that can be tracked independently from a web page or a screen load.” - Google Analytics

  21. Track / Custom Events “Downloads, mobile ad clicks, gadgets, Flash elements, AJAX embedded elements, and video plays are all examples” - Google Analytics

  22. Users / Know what they do MailChimp’s Personas

  23. DOM Events HTML

  24. HTML Events Animation events Clipboard events Drag events Form events Frame/Object events Keyboard events Media events Misc events Mouse events Print events Server-sent events Touch events Transition events HTML

  25. HTML Events Animation events Clipboard events Drag events *Form events Frame/Object events Keyboard events Media events Misc events *Mouse events Print events Server-sent events Touch events Transition events HTML

  26. HTML Events / Mouse Events I am a large link on a webpage.

  27. HTML Events / Mouse Events I am a large link on a webpage.

  28. HTML Events / Mouse Events I am a large link on a webpage. onclick

  29. HTML Events / Mouse Events I am a large link on a webpage.

  30. HTML Events / Mouse Events I am a large link on a webpage. ondblclick

  31. HTML Events / Mouse Events I am a box on a webpage

  32. HTML Events / Mouse Events I am a box on a webpage

  33. HTML Events / Mouse Events onmouseenter I am a box on a webpage onmouseover

  34. HTML Events / Mouse Events I am a box on a webpage onmousemove

  35. HTML Events / Mouse Events I am a box on a webpage onmouseleave onmouseout

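A minimal sketch of the mouse events walked through on the slides above, wired up with plain DOM listeners; the element IDs and log messages are illustrative, not taken from the slides:

    // Hypothetical elements standing in for the slides' "large link" and "box".
    var link = document.getElementById('big-link');
    var box = document.getElementById('demo-box');

    link.addEventListener('click', function () { console.log('click: the link was clicked'); });
    link.addEventListener('dblclick', function () { console.log('dblclick: the link was double-clicked'); });

    box.addEventListener('mouseenter', function () { console.log('mouseenter: pointer entered the box'); });
    box.addEventListener('mouseover', function () { console.log('mouseover: pointer is over the box or a child'); });
    box.addEventListener('mousemove', function () { console.log('mousemove: pointer is moving inside the box'); });
    box.addEventListener('mouseleave', function () { console.log('mouseleave: pointer left the box'); });
    box.addEventListener('mouseout', function () { console.log('mouseout: pointer left the box or a child'); });
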
  36. HTML Events / Form Events

  37. HTML Events / Form Events Input Events What they do here.

  38. HTML Events / Form Events Input Events What they do here. onfocus The cursor has entered the text field

  39. HTML Events / Form Events Input Events What they do here. oninput Keywords are typed into the text field

  40. HTML Events / Form Events Input Events What they do here. onblur Tabbed over to the submit button

  41. HTML Events / Form Events Input Events What they do here. onsubmit Enter is pressed and the form is submitted

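A companion sketch for the input events above, again with illustrative element IDs; each listener simply logs the moment the slides describe:

    // Hypothetical search form and text field.
    var form = document.getElementById('search-form');
    var field = document.getElementById('search-field');

    field.addEventListener('focus', function () { console.log('focus: the cursor has entered the text field'); });
    field.addEventListener('input', function () { console.log('input: keywords are typed into the text field'); });
    field.addEventListener('blur', function () { console.log('blur: focus moved on, e.g. tabbed to the submit button'); });
    form.addEventListener('submit', function () { console.log('submit: enter was pressed or the form was submitted'); });
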
  42. Collection Stats

  43. Collect / Google Analytics

  44. Collect / analytics.js

  45. GitHub Collect / Profit! Google Analytics

  46. Users / They’re here, doing things Personified by Jason Travis

  47. Users / They’re here, doing things Personified by Jason Travis

  48. Track / Custom Events What they do here.

  49. Collect / custom event format Google Analytics

  50. Collect / custom event args Google Analytics
      Value     Type    Required  Description
      Category  String  Yes       e.g. button
      Action    String  Yes       e.g. click
      Label     String  No        e.g. nav buttons
      Value     Number  No        e.g. 4 times

  51. Collect / custom event example Google Analytics Google Analytics

  52. Collect / custom event example Google Analytics Google Analytics

  53. Collect / custom event example Google Analytics Google Analytics
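
The transcript doesn't capture the code shown on the example slides above, but a minimal custom-event call using the argument format from slide 50 might look like this (the values are the slide's own examples):

    // Category, action, label, value, in that order.
    ga('send', 'event', 'button', 'click', 'nav buttons', 4);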

  54. GitHub Collect / jquery.js Google Analytics jQuery

  55. Collect / jquery example jQuery calling G.A.

  56. GitHub Collect / ga-event-track.js

  57. Collect / U of M library style

  58. Collect / Link Event

  59. Collect / Link Event

  60. Collect / Form Event

  61. Collect / Form Event
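
The link-event and form-event collection slides above aren't reproduced in the transcript either; the working code is the ga-event-track jQuery plugin linked on the next slide. As a rough, hand-rolled sketch of the same idea (selectors, category names, and labels here are illustrative, not the plugin's actual API):

    $(function () {
      // Link events: record which homepage link was clicked.
      $('a').on('click', function () {
        ga('send', 'event', 'homepage link', 'click', $(this).text().trim());
      });

      // Form events: record that a search was submitted and which query was entered.
      $('#search-form').on('submit', function () {
        ga('send', 'event', 'homepage search', 'submit', $('#search-field').val());
      });
      // A production version would also need to make sure each hit is sent before the
      // browser navigates away (e.g. the analytics.js 'transport: beacon' option).
    });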

  62. JavaScript COLLECT Add analytics.js Google gives you this. Add jQuery.js Useful JavaScript library, you probably use it. Add ga-event-track.js My jQuery plugin to capture this data. Link to GitHub project https://github.com/UMNLibraries/ga-event-track

  63. Analysis Stats

  64. Analysis searches

  65. Analysis Links

  66. Analysis Prioritization A responsive design demands strict priorities. Have we prioritized the right stuff? Example

  67. Analysis Prioritization What are our users’ revealed priorities? 80/20

  68. Javascript Analysis 0%

  69. Javascript Analysis 44%

  70. Javascript Analysis 49%

  71. Javascript Analysis 54%

  72. Javascript Analysis 59%

  73. Javascript Analysis 64%

  74. Javascript Analysis 67%

  75. Javascript Analysis 70%

  76. Javascript Analysis 72%

  77. Javascript Analysis 74%

  78. Javascript Analysis 76%

  79. Javascript Analysis 78%

  80. Javascript Analysis 79.5%

  81. Javascript Analysis 81%

  82. Analysis 80/20 Search MyU Sign In All Databases Google Scholar ASPremier Advanced Search PubMed PsycINFO Web of Science JSTOR OneStop WorldCat

  83. Analysis 80/4 13 of 364 tracked links account for over 80% of interactions It’s not 80/20, it’s 80/4.

  84. Analysis 80/4

  85. Analysis 80/4

  86. Analysis 80/4 Sh*t.

  87. Analysis Hm.

  88. Analysis Hmm.

  89. Analysis / Questions

  90. Analysis / Questions Do we trust this method, and these data?

  91. Analysis / Questions What is the value of an interaction versus an impression?

  92. Analysis / Questions Are these, in fact, our users' true priorities?

  93. Analysis / Questions If so, do they match our organizational priorities?

  94. Analysis / Questions Should they?

  95. Analysis / Questions Can we influence these results?

  96. Analysis / Questions Should we influence these results?

  97. Analysis / Questions What amount of interaction should warrant home page presence?

  98. Analysis / Questions When should strategic concerns trump demonstrated needs?

  99. Analysis / Questions This is just the beginning.

  100. Thank You all Library Technology Conference / 2015 Cody Hanson and Eric Larson University of Minnesota-Twin Cities Libraries