
"Is This Normal?" - Clearhead Opticon 2014 Presentation

"Is This Normal?" - Clearhead Opticon 2014 Presentation

A presentation by Clearhead co-founder Ryan Garner and our client Jessica Vasbinder of Warner Music Group about the very real and very normal challenges of building a successful A/B testing program.

Clearhead

April 25, 2014

Transcript

  1. Is This Normal? Debunking myths & getting real about the challenges optimization programs face.
     Ryan Garner for Clearhead. Jessica Vasbinder for Warner Music Group.
  2. Hi. I’m Ryan Garner. This is Jessica Vasbinder. This is the part where we tell you about ourselves. (@ryantgarner) (@entweetment)
  3. And we are going to do this by asking 1 simple question along the way...
  4. We ran our first set of tests. Made buttons bigger, changed copy, tested different colors. And zip, zero… Is this normal?
  5. • It’s okay to crawl before you walk, but set expectations.
       – You may (probably will) NOT materially improve your bottom line.
     • Risk (Disruption) & Reward (Lift) are correlated.
       – Small tests often provide small returns…especially for smaller businesses!
  6. We continue to build and release new features and designs that will never be tested. Is this normal?
  7. • Be patient. Be persistent.
     • Look for opportunities to test the roadmap…even when you are not asked.
     • Show value and then ask for (dedicated) resources.
  8. We don’t know where to start testing, which ideas are good and which are bad, or how to keep momentum. Is this normal?
  9. • “Always start at the bottom of the funnel.”
     • “No, start at the top of the funnel.”
     • “Test the whole funnel simultaneously.”
     • “NO, Find the bottlenecks IN the funnel.”
     • “Start with calls to action…copy, color, size.”
     • “Navigation is always a good place to test.”
     • “Forms, baby, forms.”
  10. • Your roadmap.
      • Your customers.
      • Your competitors.
      • Vendors trying to sell you stuff.
      • From throughout your company.
      • User testing.
      • Your other tests!
  11. • Your program is only as good as your ideas (hypotheses).
      • Invest significantly in their development.
      • Make the process open and accessible.
      • Store ideas and knowledge where the tribe can find it.
  12. OMFG! We have 1,000+ testing ideas. Lots and lots of them seem good. Now where do we start? Is this normal?
  13. • Create a simple scoring system and get the cream to the top (a sketch follows this slide).
        – It should need no real explanation.
        – No new acronyms!
        – Why? No hurt feelings & better ideas!!
      • Then use your own judgment.
      • Repeat regularly.
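
A minimal sketch of what such a “simple scoring system” could look like, assuming a plain sum of a few 1–5 ratings; the criteria, names, and example ideas below are illustrative assumptions, not Clearhead’s actual rubric:

```python
# Hypothetical idea-scoring sketch: rate each idea on a few plain-language
# questions (1-5) and rank by the total. No weights, no new acronyms.
from dataclasses import dataclass

@dataclass
class TestIdea:
    name: str
    potential_impact: int  # 1-5: how big could the win be?
    ease_of_build: int     # 1-5: how cheap/fast is it to test?
    evidence: int          # 1-5: how much data or user feedback backs it?

    @property
    def score(self) -> int:
        # A straight sum needs no real explanation.
        return self.potential_impact + self.ease_of_build + self.evidence

ideas = [
    TestIdea("Simplify the checkout form", 5, 2, 4),
    TestIdea("Bigger add-to-cart button", 2, 5, 2),
    TestIdea("Clarify shipping costs earlier", 4, 3, 3),
]

# Get the cream to the top, then apply human judgment to the shortlist.
for idea in sorted(ideas, key=lambda i: i.score, reverse=True):
    print(f"{idea.score:>2}  {idea.name}")
```

Re-running the ranking regularly keeps the backlog honest as new ideas and results arrive.
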
  14. Is it to:
      A. Collect pristine data in the pursuit of truth and science.
      or
      B. Grow the business…the sooner the better.
  15. • Knowing it all is a luxury few can truly afford.
        – Can you?
        – How much traffic do you have? (A rough sample-size sketch follows this slide.)
      • Assuming limited resources and limited traffic, test the bigger ideas.
        – Don’t be afraid to bundle changes. (There - we said it.)
      • If it grows the business, it grows the business.
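
As a rough illustration of the traffic question, a standard two-proportion power calculation (the baseline rate, target lifts, and thresholds below are assumed for the example) shows why detecting small lifts is a luxury:

```python
# Rough sample-size check: visitors needed per variation to detect a given
# relative lift on a baseline conversion rate (two-sided alpha=0.05, 80% power).
from statistics import NormalDist

def visitors_per_arm(baseline: float, rel_lift: float,
                     alpha: float = 0.05, power: float = 0.80) -> float:
    p1 = baseline
    p2 = baseline * (1 + rel_lift)
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)
    z_beta = NormalDist().inv_cdf(power)
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return (z_alpha + z_beta) ** 2 * variance / (p2 - p1) ** 2

# Assumed 4% baseline conversion rate: small lifts need enormous traffic.
for lift in (0.03, 0.10, 0.20):
    print(f"{lift:.0%} lift -> ~{visitors_per_arm(0.04, lift):,.0f} visitors per arm")
```

If the small-lift case is out of reach for your traffic, that is the argument for testing bigger ideas and bundling changes.
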
  16. We launched a test that broke our site! I thought building tests was easy. Is this normal?
  17. • Let’s be real here. When you launch a test, you are pushing new code onto your production website.
        – Should just anyone be doing this?
      • Do you really want a high-velocity, high-return optimization program? Of course you do!
        – You need your best developers doing the coding and testing. Not your marketers.
        – Even then, expect hiccups.
      • Monitor, monitor, monitor.
        – Test monitoring is critical, especially the first 24 hours following a launch.
        – Data usually has a story. If yours makes absolutely no sense, your test could be broken.
  18. Leadership wants a growth #, but when we add up the “lift” from all our winning tests, it’s triple our actual business. Is this normal?
  19. • There is no perfect way to do this.
      • Stress the difference between statistical significance and observed lift.
      • Trust that there is improvement and move on.
        – The only accurate way to confirm lift is to re-test.
        – Do you really want to do that?
      • And if you must show something…
        – Show the full range of possibilities (a sketch follows below).
        – Add lots of these “****”.
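
The deck doesn’t prescribe how to present “the full range of possibilities”; one common approach, sketched here with made-up counts and a normal/delta-method approximation, is to report an interval on relative lift rather than the single observed number:

```python
# Hedged sketch: report a range on relative lift instead of a point estimate.
# Counts are illustrative; the interval uses a delta-method normal approximation.
from math import sqrt

def lift_interval(conv_a: int, n_a: int, conv_b: int, n_b: int, z: float = 1.96):
    """Approximate 95% interval on the relative lift of B over A."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    lift = p_b / p_a - 1.0
    se_a = sqrt(p_a * (1 - p_a) / n_a)   # standard error of A's rate
    se_b = sqrt(p_b * (1 - p_b) / n_b)   # standard error of B's rate
    se_ratio = (p_b / p_a) * sqrt((se_a / p_a) ** 2 + (se_b / p_b) ** 2)
    return lift - z * se_ratio, lift, lift + z * se_ratio

low, observed, high = lift_interval(conv_a=400, n_a=10_000, conv_b=452, n_b=10_000)
print(f"Observed lift {observed:+.1%}, plausible range {low:+.1%} to {high:+.1%} ****")
```

In this made-up example, a +13% observed lift comes with a plausible range from roughly -2% to +28%, which is also why simply adding up the point estimates of winning tests overstates the real growth number.
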
  20. ***

  21. No one knows WTF a real hypothesis is or how to write one correctly. Is this normal?