
Devopsdays Ghent 2014: Cognitive Biases in Tech: Awareness of our own bugs in decision making

Working in technology, whether as part of a DevOps team, triaging bugs on an open source project, or simply working as part of a team, requires frequent judgement calls and decisions, often with limited information and under strong time pressure. These are the situations in which errors and missteps easily arise from our common cognitive biases and heuristics.

Why do we perceive a rise in error rate differently to an equivalent drop in availability? Why do we overestimate the chance of memorable disasters and successes recurring? Why do we tend to construct post-mortems as a sequence of predictable events even if they weren’t?

In this talk we’ll be covering the cognitive biases behind these questions and others, along with relevant examples from open source project work, operational incidents in the life of a sysadmin and various other technology situations.

Nigel Kersten is the CIO at Puppet Labs, and is responsible for the technical operations and business optimization teams there. He came to Puppet Labs from Google HQ in Mountain View where he was responsible for the design and implementation of one of the larger Puppet deployments in the world and was an active member of the Puppet open source community. Before that he was a senior sysadmin at a Sydney university, and he's been working with Linux since floppy disks actually mattered.

He's had a long-standing interest in cognitive biases and behavioral economics since studying philosophy of mind at university.

Nigel Kersten

October 27, 2014



Transcript

  1. Cognitive Biases in Tech: Awareness of our own bugs in decision making
     Nigel Kersten, CIO, VP Operations at Puppet Labs
     Devopsdays Ghent, Oct 2014
  2. What are Cognitive Biases anyway?
     • Systematic errors in how we process and interpret information with an apparently irrational result
     • Often due to heuristics that simplify information processing
     • Result from a deficiency in our thinking rather than a logical fallacy
  3. The Xerox Bad Copy Scanning Bug
     • Discovered by David Kriesel in 2013
     • 8 years old…
     • http://www.dkriesel.com/en/blog/2013/0802_xerox-workcentres_are_switching_written_numbers_when_scanning
  4. Original Copy
     “If using pattern matching upon compression there is no guarantee that parts of the scanned image actually come from the corresponding place on the paper.”
  5. Two Systems
     • System 1
       • Fast
       • Instinctive, automatic
       • Emotional
       • Subconscious
     • System 2
       • Slower, requires effort
       • Deliberate
       • Logical
       • Lazy, easily exhausted
       • Conscious
  6. The Map is not the Territory
     "MagrittePipe". Via Wikipedia - http://en.wikipedia.org/wiki/File:MagrittePipe.jpg#mediaviewer/File:MagrittePipe.jpg
  7. System 1 and System 2
     2 + 2 = ??
     2 ^ 8 = ??
     637 x 213 = ??
  8. System 1 and System 2
     LEFT left right RIGHT RIGHT left LEFT right
     upper lower LOWER upper UPPER lower LOWER upper
     1. Go down both columns, silently call out whether each word is in lowercase or uppercase by saying “upper” or “lower”.
     2. Repeat, but this time call out whether each word is to the left or the right of the center by saying “left” or “right”.
  9. Our intuitive grasp of probability and statistics sucks
     "Levy distributionPDF" by User:PAR - Own work. Licensed under Public domain via Wikimedia Commons - http://commons.wikimedia.org/wiki/File:Levy_distributionPDF.png#mediaviewer/File:Levy_distributionPDF.png
  10. The “Linda Problem”
     Picture a woman named Linda. Linda is thirty-one years old, single, outspoken, and very bright. She majored in philosophy. As a student, she was deeply concerned with issues of discrimination and social justice, and also participated in antinuclear demonstrations.
     Which alternative is more probable?
     1. Linda is a bank teller.
     2. Linda is a bank teller and is active in the feminist movement.
  11. The “Linda Problem” (same setup as slide 10)
     85% of respondents chose (2).
  12. The question to be answered
     Linda is a bank teller
     Linda is a bank teller and an active feminist
  13. What System 1 probably told System 2 to answer
     Linda is a bank teller and an active feminist
     Linda is a bank teller and is not an active feminist
  14. This feels different
     • New job, and you’re responsible for a critical web application, standard LAMP stack. MySQL is on a shared database server that has gone down almost daily for the last few weeks. You’re about to go on-call for the first time, and are preparing.
     • Which is more probable?
     1. The web application will be unavailable
     2. The web application will fail due to a MySQL outage
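
The rule behind slides 12-14 is the conjunction rule: a joint event can never be more probable than either of its parts on its own, since

$$P(A \cap B) = P(A)\,P(B \mid A) \le P(A)$$

Reading A as “the web application is unavailable” and B as “MySQL is down”, option 2 cannot be more probable than option 1, however unreliable the shared database server has been; the application of the rule to this example is a worked illustration, not part of the slides themselves.
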
  15. Intuition and Probability
     A study of the incidence of kidney cancer in the 3,141 counties of the United States reveals a remarkable pattern. The counties in which the incidence of kidney cancer is lowest are mostly rural, sparsely populated, and located in traditionally Republican states in the Midwest, the South, and the West. Why?
  16. Why? Possibly because of the clean lifestyle in rural areas. Fresh food, clean air, and lower all-round pollution.
     "MaryJane harvests corn" by MaryJane Butters - Licensed under Creative Commons Attribution-Share Alike 3.0 via Wikimedia Commons - http://commons.wikimedia.org/wiki/File:MaryJane_harvests_corn.jpg#mediaviewer/File:MaryJane_harvests_corn.jpg
  17. Intuition and Probability
     A study of the incidence of kidney cancer in the 3,141 counties of the United States reveals a remarkable pattern. The counties in which the incidence of kidney cancer is highest are mostly rural, sparsely populated, and located in traditionally Republican states in the Midwest, the South, and the West.
  18. Why? Possibly because of the poverty in rural areas. Poor access to medical care, lower education levels.
     "Poor mother and children, Oklahoma, 1936 by Dorothea Lange" by Dorothea Lange - Library of Congress LC-USF34-009694-E. Licensed under Public domain via Wikimedia Commons - http://commons.wikimedia.org/wiki/File:Poor_mother_and_children,_Oklahoma,_1936_by_Dorothea_Lange.jpg#mediaviewer/File:Poor_mother_and_children,_Oklahoma,_1936_by_Dorothea_Lange.jpg
  19.-23. How did we get here? (the same slide, repeated five times)
     • rural
     • sparsely populated
     • located in traditionally Republican states
  24. We are not intuitively good at statistics
     1. Large samples are more precise than small samples
     2. Small samples yield more extreme results more often than large samples do
  25. We are not intuitively good at statistics (same two points)
     (1) is well known and feels obvious. (2) is often not.
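
A minimal simulation sketch of point (2), assuming a simplified county-style setup in which every group shares exactly the same underlying rate; the function and numbers are illustrative only:

```python
import random


def extreme_rate_share(group_size, true_rate=0.1, groups=2_000, cutoff=2.0):
    """Fraction of groups whose observed rate is at least `cutoff` times the
    true rate, even though every group has the same underlying rate."""
    extreme = 0
    for _ in range(groups):
        hits = sum(random.random() < true_rate for _ in range(group_size))
        if hits / group_size >= cutoff * true_rate:
            extreme += 1
    return extreme / groups


if __name__ == "__main__":
    random.seed(42)
    for n in (20, 200, 2000):
        print(f"group size {n:>4}: {extreme_rate_share(n):.1%} of groups look 'extreme'")
```

Small groups hit double the true rate fairly often; large groups almost never do, which is exactly the pattern driving the kidney cancer example.
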
  26. Small Sample Size Example
     You’re hiring for a new job at work, with internal and external applicants, and have just received the feedback from all the interviewers, who are current employees. Two applications got overall positive feedback from the six interviewers.
     1. One of your current employees got reasonably positive feedback from all interviewers.
     2. An external candidate who none of the interviewers have ever worked with got absolutely outstanding feedback from two interviewers and neutral feedback from the other four.
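
One rough way to put numbers on this, assuming interviewer ratings behave like independent noisy measurements of the same underlying quality and counting only the informative ratings as data points: the standard error of an average of n ratings scales as

$$\mathrm{SE}(\bar{x}_n) = \frac{\sigma}{\sqrt{n}}, \qquad \frac{\mathrm{SE}(\bar{x}_2)}{\mathrm{SE}(\bar{x}_6)} = \sqrt{\frac{6}{2}} \approx 1.7$$

so the “outstanding” signal resting on two interviews is roughly 1.7 times noisier than the consistent signal supported by all six.
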
  27. Anchoring Effect
     When we consider a specific value for an unknown quantity before estimating it, our estimation is heavily influenced by the specific value.
  28. Anchoring Example
     1. Is the height of the tallest redwood more or less than 1,200 feet? What is your best guess about the height of the tallest redwood?
     vs
     2. Is the height of the tallest redwood more or less than 180 feet? What is your best guess about the height of the tallest redwood?
  29. Anchoring Example (same questions, with results)
     1. Mean Answer: 844 feet
     2. Mean Answer: 282 feet
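
Kahneman summarises this study with an anchoring index: the difference between the two group means divided by the difference between the anchors. Worked from the slide's numbers:

$$\frac{844 - 282}{1200 - 180} = \frac{562}{1020} \approx 55\%$$

An index of 0% would mean the anchor was ignored entirely; 100% would mean the estimates simply repeated the anchors.
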
  30. Anchoring Effect
     • One of the strongest, most repeatable effects in experimental psychology
     • Doesn’t matter whether you believe the initial information is relevant or not
     • Doesn’t matter how motivated you are to produce a correct estimate
     • Experts are still susceptible, although more resistant
     • Inconclusive whether smarter people are less susceptible
  31. Anchoring Effect relevance
     • We estimate all the time in operations and development
       • System Performance
       • SLA design
       • Monitoring checks
       • Purchasing software and hardware
       • Project Planning
  32. Biases in Postmortems
     "Various scalpels". Licensed under Public domain via Wikimedia Commons - http://commons.wikimedia.org/wiki/File:Various_scalpels.png#mediaviewer/File:Various_scalpels.png
  33. Biases in Postmortems
     • Hindsight Bias: We see events as predictable after they have occurred, regardless of whether they were or not at the time.
     • Outcome Bias: Our assessment of actions is heavily affected by the consequence of those actions.
     • Availability Heuristic: We consider easily recalled information to be more important.
     • Fundamental Attribution Error: We place too much emphasis on people’s internal characteristics rather than external factors when trying to explain their behavior.
  34. Mitigating intuitive probability
     • Work with frequencies rather than probabilities
     • Think diagrammatically
     • Internalize that subsets of random data will contain predictable looking sequences (see the sketch below)
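
A minimal sketch of that last point; the streak length and flip count here are arbitrary illustrative choices, but even a fair coin produces long, meaningful-looking streaks far more often than intuition suggests.

```python
import random


def longest_run(flips):
    """Length of the longest run of identical outcomes in a sequence."""
    best = current = 1
    for prev, nxt in zip(flips, flips[1:]):
        current = current + 1 if nxt == prev else 1
        best = max(best, current)
    return best


if __name__ == "__main__":
    random.seed(1)
    trials = 10_000
    # How often does a fair coin produce a streak of 6+ heads or tails in 50 flips?
    hits = sum(
        longest_run([random.choice("HT") for _ in range(50)]) >= 6
        for _ in range(trials)
    )
    print(f"{hits / trials:.0%} of 50-flip sequences contain a streak of 6 or more")
```
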
  35. Mitigating intuitive estimates
     1. Develop baselines for metric estimates regardless of evidence
     2. Perform intuitive estimate based on evidence
     3. Estimate correlation percentage between evidence and metric
     4. Modify baseline by (correlation x estimate), as sketched below
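
A minimal sketch of one reading of step 4, assuming it means “move from the baseline toward the intuitive estimate in proportion to the estimated correlation”; the function name and latency numbers are illustrative, not from the talk.

```python
def corrected_estimate(baseline, intuitive_estimate, correlation):
    """Move from the baseline toward the intuitive, evidence-based estimate
    in proportion to how well the evidence correlates with the metric.

    correlation = 0.0 -> ignore the evidence, keep the baseline
    correlation = 1.0 -> trust the evidence fully
    """
    return baseline + correlation * (intuitive_estimate - baseline)


if __name__ == "__main__":
    # Illustrative numbers: a baseline p95 latency of 120 ms from historical data,
    # an intuitive estimate of 300 ms after a scary-looking load test, and a guess
    # that the load test correlates ~0.4 with production behaviour.
    print(corrected_estimate(baseline=120, intuitive_estimate=300, correlation=0.4))  # 192.0
```
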
  36. Ego Depletion - System 2 gets tired easily
     • Make critical decisions early in the day
     • Minimize unimportant decisions early in the day
     • Improve your mood
       • Doesn’t improve the ability of System 2 to function
       • Doesn’t slow down ego depletion
       • Does counteract ego depletion
  37. Mitigation for postmortems
     • Hindsight Bias: Record predictions prior to results, review after results
     • Outcome Bias: Focus on and reward quality of judgements, not outcomes
     • Availability Heuristic: Examine the data that will be used to make a decision before making it
  38. Mitigating the Anchoring Effect
     • Role-play opponent’s moves when negotiating
     • Devil’s Advocate
     • Develop more expertise
     • But… it’s a really robust effect
  39. Thanks! - References and Attributions
     “Thinking, Fast and Slow” - Daniel Kahneman, ISBN: 9780374275631
     “The Human Side of Postmortems” - Dave Zwieback, ASIN: B00CLH38CM
     “Extraneous factors in judicial decisions” - Danziger, Levav, Avnaim-Pesso. http://www.pnas.org/content/108/17/6889.full.pdf
     All images are licensed for reuse without attribution, or are memes by unknown authors, unless otherwise noted.