
More Better Quality Coverage

I've given this talk many times. This deck is from a talk at DevReach 2018. You can see the video here: https://youtu.be/dk-gLlpYb3o

A talk on getting great quality coverage: testing the right things at the right time, and how to think about properly splitting automated versus exploratory testing.

Jim Holmes

May 05, 2017


Transcript

  1. • Customer needs • Feature fit • Solves the scenarios we know about • Secure • Performant • Works well in production • Plays nicely with our craptistic data
  2. Release Planning => Are scope and basic function right? What integrations? What User Acceptance criteria? What scenarios and data needed?
  3. Iteration Planning => Do we know “Why?” Do we know (sort of) “How?” Good acceptance criteria? Good data?
  4. System must take into account • Model production rules • Inventory rules • Capacity restrictions
  5. Stakeholder: “I want to create, edit and view future years’ model configs. I want to use it on web and mobile.”
  6. • Riskiest part of biz idea? • Biggest value for biz idea? • Customer audience? • Target platform(s)? • Integrations with existing systems? • Architectural impacts?
  7. “Our basic config system’s data access is unstable, and we have data consistency/accuracy errors.”
  8. “You’re asking for a wide range of mobile device support; that explodes our testing and development effort.”
  9. “You said you want to scale to support concurrent access by all of China.” “We currently have six people who do this task.”
  10. Considerations • What platforms? • What’s reasonable load? • How secure? • What’s biz value? • What happens if info is leaked?
  11. Initial design ideas • Use central data mart • Pull existing inventory data • Kendo UI for grid
  12. Considerations • Business scenarios • Acceptance criteria • Infrastructure / integration points • Data and environment needs • Performance needs • Security
  13. Outcomes • Concerns of perf on client systems • NOT testing Kendo Grid (see the sketch below) • Significant test data requirements • Comfortable with security approach
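
The “NOT testing Kendo Grid” outcome is worth unpacking: the team chose to test its own code that shapes data for the grid, not the vendor component itself. Here’s a minimal sketch of that split, assuming a hypothetical `toGridRows` mapper and `ModelConfig` type (names invented for illustration; the deck shows no code):

```typescript
import { strictEqual } from "node:assert";

// Hypothetical domain type for a future-year model config.
interface ModelConfig {
  modelYear: number;
  trim: string;
  unitsPlanned: number;
}

// Shape raw config data into the rows the grid binds to.
// This mapping is OUR code and worth testing; rendering those
// rows is the Kendo Grid's job, and we don't re-test the vendor.
function toGridRows(configs: ModelConfig[]): { year: number; label: string }[] {
  return configs
    .filter((c) => c.unitsPlanned > 0) // drop configs with no planned units
    .map((c) => ({ year: c.modelYear, label: `${c.modelYear} ${c.trim}` }));
}

// Unit test the mapper, not the third-party component.
const rows = toGridRows([
  { modelYear: 2019, trim: "LX", unitsPlanned: 500 },
  { modelYear: 2019, trim: "EX", unitsPlanned: 0 }, // filtered out
]);
strictEqual(rows.length, 1);
strictEqual(rows[0].label, "2019 LX");
console.log("grid mapping checks passed");
```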
  14. • Why are we building this? • Do we have test data yet? • Environments ready?
  15. Outcomes • Most, not all, test data ready • What’s not ready can be tested later • Dependencies in place • Good to move forward
  16. Dev-Tester Collaboration Example • “This use case isn’t clear!” • “What about this edge case?” • “Now I understand REST!”
  17. Considerations • What isn’t clear? • Did we miss something? • Tests in the right place • Integration, Unit, JS in UI, functional (see the sketch below)
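
To make “tests in the right place” concrete: put a rule’s edge cases in fast unit tests and let the integration test verify only the wiring, which is how the “little or no overlap” outcome on the next slide happens. A sketch under assumptions, with a hypothetical `isValidModelYear` rule and `/model-configs` endpoint standing in for the real system (needs Node 18+ for the global `fetch`):

```typescript
import { ok, strictEqual } from "node:assert";

// Hypothetical business rule: only future model years may be configured.
// A pure function like this belongs in a unit test, not a UI test.
function isValidModelYear(year: number, currentYear: number): boolean {
  return year > currentYear;
}

// Unit level: cover the rule's edge cases here, and only here.
strictEqual(isValidModelYear(2026, 2025), true);
strictEqual(isValidModelYear(2025, 2025), false); // current year isn't "future"
strictEqual(isValidModelYear(1999, 2025), false);

// Integration level: check the wiring once, without repeating the
// rule's edge cases. One invalid POST proves the service applies it.
async function rejectsPastModelYears(baseUrl: string): Promise<void> {
  const res = await fetch(`${baseUrl}/model-configs`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ modelYear: 1999, trim: "LX" }),
  });
  ok(res.status === 400, "service should reject past model years");
}
```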
  18. Outcomes • Little or no overlap of tests • New use cases discovered, resolved • Added to test data • BUILT AND SHIPPED WORKING STUFF!
  19. UAT

  20. Total focus on Quality • No wasted effort • UAT focuses only on gaps • Earlier efforts pay off