Small Controlled Experiments

http://verraes.net

The project was off to a bad start: an inherited legacy codebase, a waterfall contract, and a projected loss. The promise of Kaizen, or Continuous Improvement, seemed very appealing. But when we tried to incorporate it into our process, it didn't catch on. Biweekly retrospectives didn't seem to expose any problems we could improve upon. The ceremonies we tried, like Deming's Plan-Do-Check-Act cycles, added too much overhead. We were doing something wrong.

Continuous Improvement implies that you know exactly where to focus your efforts. Like scientists, we started to experiment, without deciding upfront what we expected the outcome to be. The rules? Make every experiment as small as possible. No meetings, no consensus, no cumbersome evaluation process. We let the results speak for themselves. This talk explores the successes and failures of a team that went from survival mode to learning mode over the course of a year.

Mathias Verraes

October 03, 2014
Transcript

  1. small controlled experiments @mathiasverraes

  2. small uncontrolled experiments @mathiasverraes

  3. Mathias Verraes Independent Consultant Value Object Comm.V Student of Systems Meddler of Models Labourer of Legacy verraes.net
  4. Fitness landscape

  5. None
  6. Continuous Improvement

  7. "When changing teams or organizations, the trick is not to try and push them out of their current behavior. (...) A better idea is to change parameters in the environment so that their current situation becomes unstable and disappears all by itself." (Jurgen Appelo, Management 3.0: Leading Agile Developers, Developing Agile Leaders)
  8. None
  9. Heavyweight

  10. Retrospectives are too slow

  11. Unproductive pressure to improve

  12. None
  13. Daily Two minutes, after standup

  14. Brainstorm rules "Yes, and..." Divergence Convergence ("Thinking in New Boxes", Alan Iny & Luc de Brabandere)
  15. Avoid upfront consensus "A meeting is where ideas go to die" Experiments over opinions
  16. "If we have data, let's look at data. If all we have are opinions, let's go with mine." (Jim Barksdale)
  17. Avoid upfront expectations Expectations determine outcomes

  18. Low impact Small, cheap, reversible, low-risk

  19. No backlog Backlogs kill motivation

  20. Timeline Stickies

  21. Guarantee veto Everybody must be heard

  22. Measure selectively & intentionally Avoid optimising for the metrics

  23. Accept uncertainty Non-scientific Exposes invisible problems

  24. Accept gut feeling Emotional response is fine

  25. Accept failed experiments Welcome failures as new data points

  26. Kaizen Mind The urgency to improve

  27. Climate of Doubt Assume everything is broken and fixable

  28. "If an idea is obviously bad, find a quick way to test it, because if it's not bad, then it's really interesting." (Kent Beck)
  29. Experiment Deliver one story a day

  30. Experiment Atomically scoped stories

  31. Experiment Start every story in pair

  32. Experiment Testers deploy independently

  33. Experiment Core Protocols

  34. Experiment Syncing physical boards

  35. If it's not on a wall or a board, it's not visual.
  36. Experiment Measure by hand

  37. Experiment Hide the estimate from the board

  38. Experiment No more sprint deadlines

  39. Experiment No interrupts after lunch

  40. Experiment Visualize cost of interrupts

  41. Experiment Wall of Technical Debt (http://verraes.net/2013/07/managed-technical-debt/)

  42. None
  43. Use experiments to detect problems

  44. None
  45. @mathiasverraes http://verraes.net/2014/03/small-controlled-experiments/ http://verraes.net/workshops