More Better Quality Coverage

I've given this talk many times. This deck is from a talk at DevReach 2018. You can see the video here: https://youtu.be/dk-gLlpYb3o

A talk on getting great quality coverage: testing the right things at the right time, and how to think about properly splitting automated versus exploratory testing.

Jim Holmes

May 05, 2017

Transcript

  1. More Better Quality Coverage


  2. Jim Holmes


    @aJimHolmes


    FrazzledDad.com


    [email protected]


  3. Deck:


    SpeakerDeck.com/JimHolmes


  4. Goals:


    Better conversations


    Better Quality Coverage


    BETTER STUFF!


  5. Who


    “assures quality”


    ?


  6. The


    Stakeholder


  7. What’s


    “Quality”


  8. “Quality is something of value to
    someone.”


    —Jerry Weinberg


  9. What’s


    “Quality”


  10. • Customer needs


    • Feature fit


    • Solves the scenarios we know about


    • Secure


    • Performant


    • Works well in production


    • Plays nicely with our craptistic data


  11. “Please, God,


    make it stop
    hurting so bad.”


  12. Cut Risk


  13. No lawsuits


  14. “Quality”


    also means


    “Enough information to make my
    next decision.”


  15. Business Quality


    is not the same


    as


    Technical Quality


  16. Where do conversations about
    Quality


    happen?


  17. ALL THE TIME


  18. Ideation


    Release Planning/Design


    Iteration Planning


    Building


    UAT


  19. Ideation =>


    Validate assumptions


    “Should we build this?”


  20. Release Planning =>


    Are scope and basic function right?


    What integrations?


    What User Acceptance criteria?


    What scenarios and data needed?


  21. Iteration Planning =>


    Do we know “Why?”


    Do we know (sort of) “How?”


    Good acceptance criteria?


    Good data?


  22. Building =>


    Pair up


    More test scenarios?


    Clarity of value and “how?”



  23. UAT =>


    “Here’s all the stuff we know’s
    tested. Let me go gonzo.”


  24. The Thing


  25. Appliance
    manufacturer


  26. Wants to create model configurations for future years.


    Currently done on paper.


    Or Excel.


  27. e.g.


    foreach model:


    how many units do I build with specific configurations?

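The planning loop sketched in the pseudocode above can be made concrete. This is a minimal, hypothetical example; the model names, demand figures, and configuration mix ratios are all invented for illustration:

```python
# Hypothetical sketch of the slide's planning loop:
# for each model, how many units get built with each configuration?

# Planned total units per model (invented numbers).
demand = {
    "WasherX100": 1000,
    "DryerD200": 600,
}

# Configuration mix ratios per model (invented numbers).
config_mix = {
    "WasherX100": {"base": 0.7, "deluxe": 0.3},
    "DryerD200": {"base": 0.5, "deluxe": 0.5},
}

def units_per_config(model, total_units):
    """Split one model's total build count across its configurations."""
    return {
        config: round(total_units * ratio)
        for config, ratio in config_mix[model].items()
    }

for model, total in demand.items():
    print(model, units_per_config(model, total))
```

Even a toy version like this makes the quality conversation easier: the rounding behavior, the source of the ratios, and what happens when a mix doesn't sum to 100% are all testable questions.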

  28. (image-only slide)

  29. System must take into account


    • Model production rules


    • Inventory rules


    • Capacity restrictions

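The three rule families above can be sketched as explicit checks against a proposed build plan. This is a hypothetical sketch only; the thresholds, field names, and rule messages are invented, not from the talk:

```python
# Hypothetical rule checks for a proposed build plan.
# All thresholds and names below are invented for illustration.

CAPACITY_PER_LINE = 800        # capacity restriction: max units per line
MIN_INVENTORY_BUFFER = 50      # inventory rule: parts buffer per config

def violations(plan, parts_on_hand):
    """Return a list of rule violations for a proposed plan.

    plan: {config_name: units_to_build}
    parts_on_hand: {config_name: parts available}
    """
    problems = []
    # Capacity restriction: the line can only build so many units total.
    if sum(plan.values()) > CAPACITY_PER_LINE:
        problems.append("capacity: total units exceed line capacity")
    # Inventory rule: building units must leave a minimum parts buffer.
    for config, units in plan.items():
        available = parts_on_hand.get(config, 0)
        if available - units < MIN_INVENTORY_BUFFER:
            problems.append(f"inventory: {config} would drop below buffer")
    return problems

print(violations({"base": 500, "deluxe": 400}, {"base": 600, "deluxe": 500}))
```

Framing the rules as a function that returns violations, rather than a pass/fail boolean, keeps the feedback actionable for the stakeholder who has to adjust the plan.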

  30. Risk:


    Significant revenue loss if ratios are off.


  31. Ideation


  32. Stakeholder:

    “I want to create, edit and view
    future years’ model configs.

    I want to use it on web and
    mobile.”


  33. Help business understand


    total cost


    (actual plus opportunity)


  34. Is this feasible,


    Or should we just start drinking
    now?


  35. Quality feedback


    for


    Ideation


  36. • Riskiest part of biz idea?


    • Biggest value for biz idea?


    • Customer audience?


    • Target platform(s)?


    • Integrations with existing systems?


    • Architectural impacts?


  37. “Our basic config system’s data
    access is unstable, and we have
    data consistency/accuracy
    errors.”


  38. “You’re asking for a wide range
    of mobile device support—that
    explodes our testing and
    development effort.”


  39. “You said you want to scale to
    support concurrent access by
    all of China.


    “We currently have six people
    who do this task.”


  40. Desired outcome:


    Stakeholder makes informed
    decision


  41. Load and Security are


    business value concerns


  42. Considerations


    • What platforms?


    • What’s reasonable load?


    • How secure?


    • What’s biz value?


    • What happens if info is leaked?


  43. Outcomes


    • Drop mobile


    • Use existing security


    • Reasonable load expectations


  44. Release Planning & Design


  45. At this point we’re starting to think
    about


    what


    and


    how


  46. Initial design ideas


    • Use central data mart


    • Pull existing inventory data


    • Kendo UI for grid


  47. Elaborate and specify UAT
    criteria


  48. Discuss infrastructure needs


  49. Mitigate known issues


  50. Considerations


    • Business scenarios

    • Acceptance criteria


    • Infrastructure / integration points


    • Data and environment needs


    • Performance needs


    • Security


  51. Outcomes


    • Concerns of perf on client systems


    • NOT testing Kendo Grid


    • Significant test data requirements


    • Comfortable with security approach


  52. Iteration Planning


  53. We’re Gonna
    Do Some
    Stuff!


  54. Do we know enough to start
    building and testing?


  55. Timeline / dependencies for this
    iteration


  56. • Why are we building this?


    • Do we have test data yet?


    • Environments ready?


  57. Considerations


    • Test data


    • Scenarios


    • Automated vs. exploratory
    (initial)


  58. Outcomes


    • Most, not all, test data ready


    • What’s not ready can be tested later


    • Dependencies in place


    • Good to move forward


  59. Building


  60. Let’s Build
    Stuff!


  61. Advantages of Dev/Tester Pairing:


    • Immediate feedback


    • Shared knowledge


    • Cross pollination


    • Great test coverage


  62. Dev-Tester Collaboration Example


    • “This use case isn’t clear!”


    • “What about this edge case?”


    • “Now I understand REST!”


  63. Considerations


    • What isn’t clear?


    • Did we miss something?


    • Tests in the right place


    • Integration, Unit, JS in UI, functional

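One way to read "tests in the right place": assert a business rule once at the unit level so higher-level functional and UI tests don't have to re-verify it. A minimal sketch, assuming a hypothetical ratio rule from the configuration domain:

```python
# Hypothetical business rule, checked once at the unit level:
# a model's configuration mix ratios must sum to 1.0.

def ratios_valid(ratios):
    """True if the configuration mix covers exactly 100% of production."""
    return abs(sum(ratios.values()) - 1.0) < 1e-9

# Unit-level assertions; functional and UI tests can then trust this
# rule is enforced instead of re-checking it through the grid.
assert ratios_valid({"base": 0.7, "deluxe": 0.3})
assert not ratios_valid({"base": 0.7, "deluxe": 0.4})
```

Pushing the rule down to the cheapest layer is what makes the "little or no overlap of tests" outcome on the next slide possible.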

  64. Outcomes


    • Little or no overlap of tests


    • New use cases discovered, resolved


    • Added to test data


    • BUILT AND SHIPPED WORKING STUFF!


  65. UAT


  66. Total focus on Quality


    • No wasted effort


    • UAT focuses only on gaps


    • Earlier efforts pay off


  67. Considerations


    • Understanding of test coverage


    • Ensure meets the need


    • Delivers VALUE and solves problems


  68. Outcomes


    • Comfort in quality and value of system


    • New ideas for more feature work


  69. Ship it and
    go home!


  70. Wrapping Up


  71. Push conversations


    EARLY


  72. Answer


    “Why are we building this?”


    “What’s important to know?”


  73. Be lazy:


    Do the right level of work


  74. Ship Great Stuff


  75. THANK YOU
