A/B Testing to Improve UX

Lara Hogan
October 19, 2011

A basic overview of how A/B testing in software and web development can help you improve your users' experience.

Transcript

  1. A/B Test to Improve the User Experience
     by Lara Swanson (Lead Front End Dev at Dyn)
  2. Background
     • Dyn = DNS and Email Delivery
       – E-commerce site
       – Application management
       – User settings
       – Migration tools
  3. What is an A/B test?
     • Analyze user interaction with an element:
       – Content – Design – Layout – Action
     • Create two versions of the element to test
     • "Winning" version = happy users!
     Picture by Kevin Cornell via A List Apart
  4. A: App landing page with installation screenshots
     B: App landing page with installation video
  5. A: Don't miss out on becoming a VIP user. Sign up now!
     B: Don't be an idiot; become a VIP!
  6. A: "Buy" button adds the product at $20 for 1 year
     B: "Buy" button adds the product at $35 for 2 years
  7. Running an A/B test
     1. Split users into two segments
     2. Randomly show users one of the two versions
     3. Track what users do on each version
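
In code, the split-and-randomize step can be as small as a deterministic hash bucket. A minimal Python sketch; the assign_variant helper and the experiment/user names are illustrative assumptions, not anything from the talk:

import hashlib

def assign_variant(user_id: str, experiment: str, variants=("A", "B")) -> str:
    """Deterministically bucket a user into one variant of an experiment.

    Hashing user_id + experiment keeps each user on the same variant across
    visits while still splitting traffic roughly 50/50.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return variants[int(digest, 16) % len(variants)]

# Example: serve one of the two landing pages and remember which was shown.
variant = assign_variant("user-42", "landing-page-video")
page = "landing_with_screenshots" if variant == "A" else "landing_with_video"
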
  8. Conversion Tracking
     Determine the "conversion" to track:
     • Next user action (next screen, next click)
     • Final user goal (checkout, sign up)
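
A minimal sketch of recording the chosen conversion per variant so it can be counted later; the track_event helper, field names, and append-to-file log are assumptions for illustration:

import json, time

def track_event(user_id, experiment, variant, event, path="events.log"):
    """Append one event per line: an "exposed" record when a variant is shown,
    plus whichever conversion the test cares about (next click, sign-up, checkout)."""
    record = {"ts": time.time(), "user_id": user_id, "experiment": experiment,
              "variant": variant, "event": event}
    with open(path, "a") as f:
        f.write(json.dumps(record) + "\n")

# One record when the page is served, one if the user reaches the goal.
track_event("user-42", "landing-page-video", "B", "exposed")
track_event("user-42", "landing-page-video", "B", "completed_installation")
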
  9. A: App landing page with installation screenshots
     B: App landing page with installation video
     Measure: % of new users who completed installation
  10. A: Don't miss out on becoming a VIP user. Sign up now!
      B: Don't be an idiot; become a VIP!
      Measure: % of users who signed up
  11. A: "Buy" button adds the product at $20 for 1 year
      B: "Buy" button adds the product at $35 for 2 years
      Measure: average order value per completed checkout
  12. Conversion Tracking
      • Statistical significance
        – 95% confidence level to determine the "winner"
      • You'll need enough users and conversions to have significant results
      My favorite calculator: usereffect.com/split-test-calculator
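
The 95% confidence level on the slide corresponds to a standard two-proportion z-test (|z| >= 1.96). A minimal sketch doing by hand what the linked calculator does; the conversion counts below are made up for illustration:

import math

def z_test_two_proportions(conv_a, n_a, conv_b, n_b):
    """Two-sided z statistic comparing the conversion rates of A and B."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Illustrative numbers only: 120/2000 conversions on A vs 156/2000 on B.
z = z_test_two_proportions(120, 2000, 156, 2000)
print(f"z = {z:.2f}, significant at 95%: {abs(z) >= 1.96}")
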
  13. Additional Data
      Examine data for additional user actions beyond the primary conversion rate
  14. Additional Data
      Segment your user data by various demographics
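
Building on the hypothetical event log sketched earlier, segmentation can be a simple group-by over the same records; the segment_field and event names here are assumptions:

import json
from collections import defaultdict

def conversion_by_segment(path="events.log", goal="completed_installation",
                          segment_field="country"):
    """Conversion rate per (variant, segment), using the log format above."""
    exposed, converted = defaultdict(set), defaultdict(set)
    with open(path) as f:
        for line in f:
            e = json.loads(line)
            key = (e["variant"], e.get(segment_field, "unknown"))
            if e["event"] == "exposed":
                exposed[key].add(e["user_id"])
            elif e["event"] == goal:
                converted[key].add(e["user_id"])
    return {k: len(converted[k]) / len(exposed[k]) for k in exposed if exposed[k]}

print(conversion_by_segment())
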
  15. Multivariate Tests
      Create multiple versions and test them all at the same time
      (use adcomparator.com's Taguchi-Based calculator)
      (source)
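
A minimal sketch of multivariate assignment with hypothetical factor names; this enumerates every combination (full factorial), whereas the Taguchi-based calculator the slide points to tests only a chosen subset of them:

import hashlib
from itertools import product

# Hypothetical factors; a multivariate test varies several elements at once.
FACTORS = {
    "headline": ["prominent", "plain"],
    "button":   ["green_link", "red_bevel"],
    "media":    ["screenshots", "video"],
}

def assign_combination(user_id: str, experiment: str) -> dict:
    """Deterministically map a user to one combination of all factor levels."""
    combos = list(product(*FACTORS.values()))
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return dict(zip(FACTORS.keys(), combos[int(digest, 16) % len(combos)]))

print(assign_combination("user-42", "signup-page-mvt"))
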
  16. Best Practices
      • Don't let tests go on forever
      • Test "screams" rather than "whispers"
      • Use tests to solve UX design disagreements
      Photo by Poppy Thomas-Hill (source)
  17. Resources
      • whichtestwon.com
      • alistapart.com/articles/a-primer-on-a-b-testing/
      • smashingmagazine.com/the-ultimate-guide-to-a-b-testing/
      • abtests.com (next slides)
  18. Green links
      Red beveled buttons
  19. Red beveled buttons
      Winner: 5% more people clicked
  20. Has "social proof"
      No "social proof"
  21. No "social proof"
      Winner: 102% more people signed up
  22. Large screenshots, significantly longer page
      Small screenshots, shorter page
  23. Large screenshots, significantly longer page
      Winner: 85% more people downloaded the trial
  24. Has name of blog
      No name of blog
  25. No name of blog
      Winner: 60% more people signed up
  26. Prominent headline
      Bulleted benefits of registration
      More screenshots
      Sign up now button at top
  27. Prominent headline
      Bulleted benefits of registration
      Winner: 128% more signups
  28. Questions?