
More Better Quality Coverage

I've given this talk many times. This deck is from a talk at DevReach 2018. You can see the video here: https://youtu.be/dk-gLlpYb3o

A talk on getting great quality coverage: testing the right things at the right time, and how to think about properly splitting work between automated and exploratory testing.


Jim Holmes

May 05, 2017



  1. More Better Quality Coverage

  2. Jim Holmes @aJimHolmes FrazzledDad.com jim@GuidepostSystems.com

  3. Deck: SpeakerDeck.com/JimHolmes

  4. Goals: Better conversations Better Quality Coverage BETTER STUFF!

  5. Who “assures quality”?

  6. The Stakeholder

  7. What’s “Quality”

  8. “Quality is something of value to someone.” —Jerry Weinberg

  9. What’s “Quality”

  10. • Customer needs • Feature fit • Solves the scenarios we know about • Secure • Performant • Works well in production • Plays nicely with our craptistic data

  11. “Please, God, make it stop hurting so bad.”

  12. Cut Risk

  13. No lawsuits

  14. “Quality” also means “Enough information to make my next decision.”

  15. Business Quality is not the same as Technical Quality

  16. Where do conversations about Quality happen?


  18. Ideation • Release Planning/Design • Iteration Planning • Building • UAT

  19. Ideation => Validate assumptions “Should we build this?”

  20. Release Planning => Are scope and basic function right? What integrations? What User Acceptance criteria? What scenarios and data needed?

  21. Iteration Planning => Do we know “Why?” Do we know (sort of) “How?” Good acceptance criteria? Good data?

  22. Building => Pair up. More test scenarios? Clarity of value and “how”?

  23. UAT => “Here’s all the stuff we know’s tested. Let me go gonzo.”

  24. The Thing

  25. Appliance manufacturer

  26. Wants to create model configurations for future years. Currently done on paper. Or Excel.

  27. e.g. foreach model: how many units do I build with specific configurations?

  29. System must take into account • Model production rules • Inventory rules • Capacity restrictions

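The production, inventory, and capacity rules above can be sketched as a small validation check. This is a hypothetical illustration only: the function, rule shapes, and numbers are assumed, since the talk does not describe the real system's API.

```python
# Hypothetical sketch: validate a proposed model-year configuration plan
# (config name -> unit count) against the three rule categories the slide
# lists. All names and rule shapes are assumed for illustration.

def validate_plan(plan, production_rules, inventory, capacity):
    """Return a list of human-readable rule violations for the plan."""
    violations = []
    for config, units in plan.items():
        # Model production rules: allowed min/max build quantity per config
        lo, hi = production_rules.get(config, (0, float("inf")))
        if not lo <= units <= hi:
            violations.append(f"{config}: {units} outside allowed range {lo}-{hi}")
        # Inventory rules: parts on hand must cover the requested units
        on_hand = inventory.get(config, 0)
        if units > on_hand:
            violations.append(f"{config}: only {on_hand} units of parts in stock")
    # Capacity restriction: total units across all configs
    total = sum(plan.values())
    if total > capacity:
        violations.append(f"total {total} exceeds plant capacity {capacity}")
    return violations

plan = {"base": 500, "deluxe": 300}
issues = validate_plan(
    plan,
    production_rules={"base": (100, 600), "deluxe": (50, 250)},
    inventory={"base": 800, "deluxe": 400},
    capacity=700,
)
# "deluxe" exceeds its production range and the total exceeds capacity,
# so two violations come back -- exactly the kind of ratio error the
# next slide flags as a significant revenue risk.
```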
  30. Risk: Significant revenue loss if ratios are off.

  31. Ideation

  32. Stakeholder: “I want to create, edit and view future years’ model configs. I want to use it on web and mobile.”

  33. Help business understand total cost (actual plus opportunity)

  34. Is this feasible, or should we just start drinking now?

  35. Quality feedback for Ideation

  36. • Riskiest part of biz idea? • Biggest value for biz idea? • Customer audience? • Target platform(s)? • Integrations with existing systems? • Architectural impacts?

  37. “Our basic config system’s data access is unstable, and we have data consistency/accuracy errors.”

  38. “You’re asking for a wide range of mobile device support—that explodes our testing and development effort.”

  39. “You said you want to scale to support concurrent access by all of China. We currently have six people who do this task.”

  40. Desired outcome: Stakeholder makes informed decision

  41. Load and Security are business value concerns

  42. Considerations • What platforms? • What’s reasonable load? • How secure? • What’s biz value? • What happens if info is leaked?

  43. Outcomes • Drop mobile • Use existing security • Reasonable load expectations

  44. Release Planning & Design

  45. At this point we’re starting to think about “what” and “how.”

  46. Initial design ideas • Use central data mart • Pull existing inventory data • Kendo UI for grid

  47. Elaborate and specify UAT criteria

  48. Discuss infrastructure needs

  49. Mitigate known issues

  50. Considerations • Business scenarios • Acceptance criteria • Infrastructure / integration points • Data and environment needs • Performance needs • Security

  51. Outcomes • Concerns of perf on client systems • NOT testing Kendo Grid • Significant test data requirements • Comfortable with security approach

  52. Iteration Planning

  53. We’re Gonna Do Some Stuff!

  54. Do we know enough to start building and testing?

  55. Timeline / dependencies for this iteration

  56. • Why are we building this? • Do we have test data yet? • Environments ready?

  57. Considerations • Test data • Scenarios • Automated vs. exploratory

  58. Outcomes • Most, not all, test data ready • What’s not ready can be tested later • Dependencies in place • Good to move forward

  59. Building

  60. Let’s Build Stuff!

  61. Advantages of Dev/Tester Pairing: • Immediate feedback • Shared knowledge • Cross-pollination • Great test coverage

  62. Dev-Tester Collaboration Example • “This use case isn’t clear!” • “What about this edge case?” • “Now I understand REST!”

  63. Considerations • What isn’t clear? • Did we miss something? • Tests in the right place? • Integration, Unit, JS in UI, functional

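A minimal sketch of “tests in the right place”: logic like this belongs in fast unit tests rather than UI or integration suites, and pairing often surfaces the edge cases worth pinning down. The `units_for_ratio` helper and its behavior are assumed for illustration; they are not from the talk.

```python
import unittest

# Hypothetical helper: split a planned build total across configs by ratio.
# Assumed for illustration only -- not part of the system the talk describes.
def units_for_ratio(total_units, ratio):
    if not 0.0 <= ratio <= 1.0:
        raise ValueError("ratio must be between 0 and 1")
    return round(total_units * ratio)

class UnitsForRatioTest(unittest.TestCase):
    # Unit-level tests: pure logic, no UI, no integration dependencies.
    def test_half_ratio(self):
        self.assertEqual(units_for_ratio(1000, 0.5), 500)

    def test_edge_case_zero_ratio(self):
        # The kind of edge case a dev/tester pair flags: zero is valid input.
        self.assertEqual(units_for_ratio(1000, 0.0), 0)

    def test_invalid_ratio_rejected(self):
        with self.assertRaises(ValueError):
            units_for_ratio(1000, 1.5)
```

Keeping checks like these at the unit level is what makes the later “little or no overlap of tests” outcome possible: slower integration and functional suites then only cover what the unit tests cannot.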
  64. Outcomes • Little or no overlap of tests • New use cases discovered, resolved • Added to test data • BUILT AND SHIPPED WORKING STUFF!

  65. UAT

  66. Total focus on Quality • No wasted effort • UAT focuses only on gaps • Earlier efforts pay off

  67. Considerations • Understanding of test coverage • Ensure it meets the need • Delivers VALUE and solves problems

  68. Outcomes • Comfort in quality and value of system • New ideas for more feature work

  69. Ship it and go home!

  70. Wrapping Up

  71. Push conversations EARLY

  72. Answer “Why are we building this?” “What’s important to know?”

  73. Be lazy: Do the right level of work

  74. Ship Great Stuff