More Better Quality Coverage

From StirTrek 2017.

A talk on getting great quality coverage: testing the right things at the right time, and how to think about splitting work between automated and exploratory testing.

Jim Holmes

May 05, 2017

Transcript

  1. More Better Quality Coverage

  2. Jim Holmes @aJimHolmes FrazzledDad.com jim@GuidepostSystems.com

  3. Deck: SpeakerDeck.com/JimHolmes

  4. Goals: • Better conversations • Better Quality Coverage • BETTER STUFF!

  5. What’s “Quality”?

  6. “Quality is something of value to someone.” —Jerry Weinberg

  7. • Customer needs • Feature fit • Solves the scenarios we know about • Secure • Performant • Works well in production • Plays nicely with our craptistic data

  8. “Please, God, make it stop hurting so bad.”

  9. Cut Risk

  10. No lawsuits

  11. “Quality” also means “Enough information to make my next decision.”

  12. Business Quality is not the same as Technical Quality

  13. Where do conversations about Quality happen?

  14. Which part of “ALL THE FREAKIN’ TIME” did you not understand?

  15. Ideation • Release Planning/Design • Iteration Planning • Building • UAT

  16. Ideation => Validate assumptions “Should we build this?”

  17. Release Planning => Are scope and basic function right? What integrations? What User Acceptance criteria? What scenarios and data needed?

  18. Iteration Planning => Do we know “Why?” Do we know (sort of) “How?” Good acceptance criteria? Good data?

  19. Building => Pair up • More test scenarios? • Clarity of value and “how?”

  20. UAT => “Here’s all the stuff we know’s tested. Let me go gonzo.”

  21. The Thing

  22. Appliance manufacturer

  23. Wants to create model configurations for future years. Currently done on paper. Or Excel.

  24. e.g. foreach model: how many units do I build with specific configurations?

  25. None

  26. System must take into account • Model production rules • Inventory rules • Capacity restrictions

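To make those three rule types concrete, here is a minimal sketch of how a configuration plan and its checks might be modeled. Every name here (`ModelConfig`, `PlantRules`, `violations`) is invented for illustration; none of this code is from the talk.

```typescript
// Hypothetical model: a planned production run for one configuration.
interface ModelConfig {
  model: string;  // e.g. "WM-2400"
  year: number;   // future model year being planned
  units: number;  // units to build in this configuration
}

// Hypothetical container for the three rule types on the slide.
interface PlantRules {
  capacityLimit: number;                        // capacity restriction: max total units
  minUnitsPerConfig: number;                    // production rule: smallest viable run
  inventoryCeiling: (model: string) => number;  // inventory rule, per model
}

// Collect every rule violation in a proposed plan.
function violations(plan: ModelConfig[], rules: PlantRules): string[] {
  const problems: string[] = [];
  const total = plan.reduce((sum, c) => sum + c.units, 0);

  if (total > rules.capacityLimit) {
    problems.push(`total ${total} exceeds plant capacity ${rules.capacityLimit}`);
  }
  for (const c of plan) {
    if (c.units < rules.minUnitsPerConfig) {
      problems.push(`${c.model}: run of ${c.units} is below the minimum viable run`);
    }
    if (c.units > rules.inventoryCeiling(c.model)) {
      problems.push(`${c.model}: exceeds the inventory ceiling`);
    }
  }
  return problems;
}
```

With the rules gathered in one place like this, the “ratios are off” risk on the next slide becomes a set of checks the team can test directly.
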
  27. Risk: Potential loss of MILLIONS of USD if ratios are off.

  28. Ideation

  29. Stakeholder: “I want to create, edit and view future years’ model configs. I want to use it on web and mobile.”

  30. Help business understand total cost (actual plus opportunity)

  31. Is this feasible, or should we just start drinking now?

  32. Quality feedback for Ideation

  33. • Riskiest part of biz idea? • Biggest value for biz idea? • Customer audience? • Target platform(s)? • Integrations with existing systems? • Architectural impacts?

  34. “Our basic config system’s data access is unstable, and we have data consistency/accuracy errors.”

  35. “You’re asking for a wide range of mobile device support—that explodes our testing and development effort.”

  36. “You said you want to scale to support concurrent access by all of China.” “We currently have six people who do this task.”

  37. Desired outcome: Stakeholder makes informed go/no-go decision

  38. Load and Security are business value concerns

  39. Considerations • What platforms? • What’s reasonable load? • How secure? • What’s biz value? • What happens if info is leaked?

  40. Outcomes • Drop mobile • Use existing security • Reasonable load expectations

  41. Release Planning & Design

  42. At this point we’re starting to think about what and how

  43. Initial design ideas • Use central data mart • Pull existing inventory data • Kendo UI for grid

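For context on that last bullet, a Kendo UI grid gets wired up roughly as below. This is a sketch assuming the jQuery flavor of Kendo UI; the element id, fields, and data are invented for this example and are not from the talk.

```typescript
import $ from "jquery";
import "@progress/kendo-ui"; // assumption: Kendo UI for jQuery is installed

// Bind an invented model-config plan to a grid. In the real system the
// rows would come from the central data mart / existing inventory feed.
$("#configGrid").kendoGrid({
  dataSource: {
    data: [
      { model: "WM-2400", year: 2018, units: 1200 },
      { model: "WM-2600", year: 2018, units: 800 },
    ],
    pageSize: 20,
  },
  columns: [
    { field: "model", title: "Model" },
    { field: "year",  title: "Model Year" },
    { field: "units", title: "Planned Units" },
  ],
  editable: true, // the stakeholder asked to create and edit configs
  pageable: true,
});
```
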
  44. Elaborate and specify UAT criteria

  45. Discuss infrastructure needs

  46. Mitigate known issues

  47. Considerations • Business scenarios • Acceptance criteria • Infrastructure / integration points • Data and environment needs • Performance needs • Security

  48. Outcomes • Concerns of perf on client systems • NOT testing Kendo Grid • Significant test data requirements • Comfortable with security approach

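“NOT testing Kendo Grid” is worth pausing on: the team decided to trust the vendor’s widget and spend its effort on its own rules. Here is a sketch of where that boundary might land, using Jest-style tests against the hypothetical `violations` helper from the earlier sketch (the module path and data are invented):

```typescript
import { describe, it, expect } from "@jest/globals";
// `violations` and `PlantRules` are the hypothetical helpers sketched
// earlier -- our logic, the part worth testing.
import { violations, PlantRules } from "./planRules";

const rules: PlantRules = {
  capacityLimit: 2000,
  minUnitsPerConfig: 100,
  inventoryCeiling: () => 1500,
};

describe("plan validation (our code, not the Kendo grid)", () => {
  it("flags plans that exceed plant capacity", () => {
    const plan = [
      { model: "WM-2400", year: 2018, units: 1200 },
      { model: "WM-2600", year: 2018, units: 900 },
    ];
    expect(violations(plan, rules)).toHaveLength(1); // 2100 > 2000
  });

  it("accepts a plan inside every rule", () => {
    const plan = [{ model: "WM-2400", year: 2018, units: 500 }];
    expect(violations(plan, rules)).toHaveLength(0);
  });
});
```

The grid’s rendering and editing behavior stays the vendor’s problem; the team’s tests target the allocation rules the grid merely displays.
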
  49. Iteration Planning

  50. We’re Gonna Do Some Stuff!

  51. Do we know enough to start building and testing?

  52. Timeline / dependencies for this iteration

  53. • Why are we building this? • Do we have test data yet? • Environments ready?

  54. Considerations • Test data • Scenarios • Automated vs. exploratory (initial)

  55. Outcomes • Most, not all, test data ready • What’s not ready can be tested later • Dependencies in place • Good to move forward

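One common way to keep “most, not all” test data from blocking the team is a builder with sane defaults that each test overrides as needed. A hypothetical sketch in that spirit (nothing here is from the talk):

```typescript
// Hypothetical test-data builder: sensible defaults, per-test overrides.
interface ModelConfig {
  model: string;
  year: number;
  units: number;
}

function aConfig(overrides: Partial<ModelConfig> = {}): ModelConfig {
  return {
    model: "WM-2400",
    year: 2018,
    units: 500,
    ...overrides, // each test overrides only the field it actually exercises
  };
}

// Usage: a capacity test only cares about units.
const oversizedRun = aConfig({ units: 50_000 });
```
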
  56. Building

  57. Let’s Build Us Some Stuff!

  58. Advantages of Dev/Tester Pairing: • Immediate feedback • Shared knowledge • Cross-pollination • Great test coverage

  59. Dev-Tester Collaboration Example • “This use case isn’t clear!” • “What about this edge case?” • “Sweet, now I understand REST!”

  60. Considerations • What isn’t clear? • Did we miss something? • Tests in the right place • Integration, Unit, JS in UI, functional

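“Tests in the right place” means matching each check to the cheapest level that can catch the failure. As one illustration (the endpoint and payload shape are invented for this sketch), a “saved plan round-trips” check belongs at the integration level against the real REST service, while pure arithmetic like capacity totals stays in unit tests:

```typescript
import { describe, it, expect } from "@jest/globals";

// Integration-level check against a hypothetical REST endpoint.
// The URL and payload shape are invented for this sketch.
const BASE = process.env.API_BASE ?? "http://localhost:8080";

describe("model-config API (integration)", () => {
  it("round-trips a saved configuration", async () => {
    const config = { model: "WM-2400", year: 2018, units: 500 };

    const saved = await fetch(`${BASE}/api/configs`, {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify(config),
    });
    expect(saved.status).toBe(201);

    const { id } = await saved.json();
    const fetched = await fetch(`${BASE}/api/configs/${id}`);
    expect(await fetched.json()).toMatchObject(config);
  });
});
```
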
  61. Outcomes • New use case discovered, resolved • Added to test data • BUILT AND SHIPPED WORKING STUFF!

  62. UAT

  63. Total focus on Quality • No wasted effort • UAT focuses only on gaps • Earlier efforts pay off

  64. Considerations • Understanding of test coverage • Ensure meets the need • Delivers VALUE and solves problems

  65. Outcomes • Comfort in quality and value of system • New ideas for more feature work

  66. Ship it and go home!

  67. Wrapping Up

  68. Push conversations EARLY

  69. Answer “Why are we building this?” “What’s important to know?”

  70. Be lazy: Do the right level of work

  71. Ship Great Stuff

  72. THANK YOU

  73. 0) No SharePoint

  74. B) Hug your loved ones RIGHT NOW