
Measuring Quality in Software


JapanTestCommunity

May 03, 2024

Transcript

  1. Thomas Santonja, VP of Engineering

     About me:
     ⬢ 17 years in software development
     ⬢ 11 years in leadership
     ⬢ Regulated B2B (healthcare)
     ⬢ B2C (e-commerce) - managing the QA dept
     ⬢ B2B (consulting and SaaS)
  2. Today’s focus - a chronologically sequential process in building measurement, along with best practices

     Intro & Definitions
     Software quality: “the degree to which software meets specified requirements and user expectations.”
     Measurement: “the process of quantitatively assessing specific attributes or characteristics of software using defined metrics or criteria.”
     Data-driven decision-making: “the process of using data to inform your decision-making process and validate a course of action before committing to it.”
     Intuition: “the ability to understand something instinctively, without the need for conscious reasoning.”
  3. Positive ROI, that’s why

     Why ensure software quality? Usual suspects:
     ⬢ Identify defects early in the development process
     ⬢ Ensure compliance with standards
     ⬢ Enhance customer satisfaction

     ROI % = (Cost of Late Fix - Cost of Early Detection) / Cost of Early Detection x 100

     Looks easy, but not passing regulation = no product = 0% revenue. After the regulation is passed, extra quality consideration uses the same ROI as above: the same as the first for preventive quality, and its own complex formula for reactive quality.
  4. Identify defects early in the development process - breaking down the simplest one

     ROI % = (Cost of Late Fix - Cost of Early Detection) / Cost of Early Detection x 100

     Cost of late fix (simplified example):
     ⬢ Average cost estimation of a fix after deployment
     ⬢ Average cost estimation of a fix during development
     ⬢ Estimation of the cost of an issue hitting production on the product revenue
     ⬢ Estimation of the cost of downtime
     ⬢ Sum of all costs - cost of early fix

     What period of time is relevant - do you just guess? Does your team input the time, or do you just guess? Is your revenue, marketing or customer support team capable of estimating that impact, and where is that data coming from? Do you have actual uptime statistics to infer this value? If some of that information is missing, should you invest in acquiring it? What is the ROI for that?
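The ROI formula above can be sketched in a few lines. The cost figures below are purely illustrative, not from the talk:

```python
# Minimal sketch of the ROI formula from the slide; input figures are hypothetical.
def roi_percent(cost_late_fix: float, cost_early_detection: float) -> float:
    """ROI % = (cost of late fix - cost of early detection) / cost of early detection x 100."""
    return (cost_late_fix - cost_early_detection) / cost_early_detection * 100

# Made-up example: a bug costs 5,000 to fix after deployment,
# but only 1,000 to catch and fix during development.
print(roi_percent(5000, 1000))  # 400.0
```

Note that every input to this function is itself an estimate, which is exactly the point the slide makes: the formula is trivial, but sourcing credible numbers for it is not.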
  5. Value of measuring quality

     “Measuring leads to compliance, trusting leads to innovation” - a wise man, maybe

     Some examples:
     ⬢ Changing behaviour
     ⬢ Removing biases (especially your own)
     ⬢ Negotiating budgets
     ⬢ Backing evaluation arguments
     ⬢ Supporting fact-based decision making
     ⬢ Getting yourself promoted (you never know!)
     ⬢ Anything you need in your specific situation
  6. Usual Metrics (Why? Cheatable? Easy to report?)

     Defect Density - Why: identify the most complex codebase, eventually reducing bug escapes. Cheatable: under-report bugs. Easy to report: no - a unit of software size is not easy and likely inconsistent across teams.
     Failure Rate - Why: identify the most complex codebase, eventually reducing bug escapes. Cheatable: under-report bugs. Easy to report: no - a unit of software size is not easy and likely inconsistent across teams.
     Code Complexity - Why: identify the most complex codebase, eventually reducing maintenance costs. Cheatable: not really, easy to ignore though. Easy to report: yes - all languages can report basic metrics with ease.
     Test Coverage - Why: identify less secure pipelines, eventually reducing bug escapes. Cheatable: yes, especially for unit tests; functional tests highly depend on expertise. Easy to report: yes for unit test coverage (built into most IDEs); no for functional coverage, which is complex.
     Mean Time Between Failures - Why: identify the product with the lowest quality, eventually increasing stability. Cheatable: under-report bugs, under-report failure duration. Easy to report: no - depends on human input.
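Two of the metrics above are simple ratios. A minimal sketch, assuming KLOC as the (imperfect, team-dependent) unit of software size the slide warns about:

```python
def defect_density(defect_count: int, kloc: float) -> float:
    """Defects per thousand lines of code (KLOC).
    KLOC is one common choice of 'unit of software size' - the slide notes
    that any such unit is hard to keep consistent across teams."""
    return defect_count / kloc

def mtbf(total_operating_hours: float, failure_count: int) -> float:
    """Mean Time Between Failures: operating time divided by failure count.
    Both inputs typically depend on human reporting, hence cheatable."""
    return total_operating_hours / failure_count

print(defect_density(42, 120.0))  # 0.35 defects per KLOC
print(mtbf(720.0, 3))             # 240.0 hours between failures
```

The arithmetic is trivial; the hard (and gameable) part is the data collection behind `defect_count` and `failure_count`, which is what makes these metrics cheatable.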
  7. Challenges and limitations - time to implement!

     Usefulness: “Those metrics are useless”
     Cost: “This is too hard”, “we can’t prioritize reporting”
     Accuracy: “The data is inaccurate, leading to incorrect decisions”
     Personal feelings:
     ⬢ “Those reports will be used against me, and they are not representative of the value I provide here”
     ⬢ “This will motivate others to only focus on meeting that specific metric and not try to do their best”
     ⬢ “Why are you measuring this - do you not trust me?”
  8. Solutions and Mitigations - Communication (Usefulness)

     Do I see how that metric will help drive change?
     ⬢ No: redesign
     ⬢ Yes: Can I convince others?
     ⬡ No: keep it to yourself
     ⬡ Yes: share with everyone - challenge solved
  9. Solutions and Mitigations - Implementation cost

     Purchase approach:
     ⬢ Investigate / select tooling
     ⬢ Integrate tool
     ⬢ Tool cost
     ⬢ Lower implementation cost
     ⬢ Low quality output
     ⬢ Data will be cheated on or inaccurate

     Custom approach:
     ⬢ Design metric(s)
     ⬢ Change process or input
     ⬢ Aggregate data
     ⬢ Triage, filter
     ⬢ Build the report / dashboard
     ⬢ Higher implementation cost
     ⬢ High quality output
     ⬢ Valuable metric
  10. Solutions and Mitigations - Data accuracy

     Root causes and solution(s):
     ⬢ Human mistake - automate / enforce
     ⬢ Cheating - incentivized input
  11. Solutions and Mitigations - Motivation and retention (Feelings)

     System data:
     ⬢ Service uptime
     ⬢ Customer-raised tickets
     ⬢ Company revenue

     Change management:
     ⬢ Communication
     ⬡ Transparency
     ⬡ Trust
     ⬡ Honesty
     ⬢ Measure teams or departments
     ⬢ Politics
  12. Summary

     What problem to solve? Do you need metrics? Design your metric. Build the data pipeline. Communicate. Profit.
  13. Case Study - Time to reaction

     What problem to solve? Concern about our slow response time, especially for first investigation and estimates.
     Do you need metrics? The solution should be team-specific, and thus driven through team results.
     Design your metric: establish an SLO per severity; report the % of tickets above SLO, per severity.
     Build data pipeline: record time of creation and time of first review by the dev team; add all filters necessary.
     Communicate: explain the problem, the aim, the metric and the change in process to record the data.
     Profit: use it to measure the result of team practice changes, and discuss other ideas.
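The metric described in the case study can be sketched as follows. The ticket fields, severity levels and SLO values are illustrative assumptions, not the team's actual configuration:

```python
# Sketch of "% of tickets above SLO, per severity".
# SLO values and field names are hypothetical examples.
from datetime import datetime, timedelta

# Hypothetical SLOs: maximum time allowed before first review.
SLO_BY_SEVERITY = {
    "critical": timedelta(hours=4),
    "major": timedelta(days=1),
    "minor": timedelta(days=5),
}

def percent_above_slo(tickets: list[dict], severity: str) -> float:
    """% of tickets of the given severity whose first review exceeded the SLO."""
    relevant = [t for t in tickets if t["severity"] == severity]
    if not relevant:
        return 0.0
    breached = sum(
        1 for t in relevant
        if t["first_review"] - t["created"] > SLO_BY_SEVERITY[severity]
    )
    return breached / len(relevant) * 100

tickets = [
    {"severity": "critical",
     "created": datetime(2024, 5, 1, 9, 0),
     "first_review": datetime(2024, 5, 1, 15, 0)},  # 6h: above the 4h SLO
    {"severity": "critical",
     "created": datetime(2024, 5, 2, 9, 0),
     "first_review": datetime(2024, 5, 2, 10, 0)},  # 1h: within the SLO
]
print(percent_above_slo(tickets, "critical"))  # 50.0
```

This mirrors the pipeline step in the case study: recording creation and first-review timestamps is the data-pipeline work, and the percentage per severity is the report that gets communicated.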
  14. Q & A

     Most icons used in this presentation, outside of the bee icon from Autify, come from Flaticon.com