Mayday! Software Lessons from Aviation Disasters

Adele Carpenter

June 16, 2021

Transcript

  1. Mayday!
    Adele Carpenter
    Software lessons from aviation disasters
    @iam_carpenter

  2. Photo by Graham Dinsdale via Jetphotos

  3. Approximation of cabin
    Photo by Usien via wikipedia commons

  4. Photo by Nick Bair via wikipedia commons

  5. Chain of events
    Partnair 394

  6. Photo by Chandler Cruttenden on Unsplash
    F-16 sonic boom?
    Lockerbie bombing Part II?
    Photo via wikimedia commons

  7. Finn gathers the team...
    • How can we prove ourselves wrong?
    • What details might we be ignoring?
    • What else could cause a mid-air break-up?
    • What are the precise levels? Other sources of residue?
    • No terrorist groups have claimed credit
    • Countless possibilities, need to narrow the scope

  8. Photo by Chandler Cruttenden on Unsplash
    F-16 sonic boom?
    Lockerbie bombing Part II?
    Photo via wikimedia commons

  9. Photo by Ana Municio on Unsplash

  10. Question yourself
    • How can I prove myself wrong?
    • What details might I be ignoring because they don’t fit my
    theory or solution?
    • What else could cause this issue or situation?

  11. Photo courtesy of BEA
    Photo courtesy of NTSB

  12.

  13. Photo by Brian Gordon via picryl

  14. Photo by Nick Bair via wikipedia commons

  15.

  16. Photo by Brian Gordon via picryl

  17. Chain of events
    Partnair 394

  18. Counterfeit Code?

  19. Improving software quality
    Finn’s top tips
    • Outcomes as a chain of events to learn from
    • Taking our time to investigate all options
    • Accepting responsibility for our code - no “counterfeit code”
    • Following the evidence
    • Questioning ourselves rigorously

  20.

  21. Photo by Dan Parlante on Unsplash

  22. Approximation of actual cockpit
    Photo by Keishi Nukina, KNaviation
    Captain McBroom
    First Officer Beebee
    Second Officer Frosty
    DC-8-61 Cockpit
    28 December 1978
    UA 173

  23. Approximation of actual cabin
    Courtesy of Runway Girl Network

  24. Approximation of actual cockpit
    Photo by Keishi Nukina, KNaviation
    Captain McBroom
    First Officer Beebee
    Second Officer Frosty
    DC-8-61 Cockpit
    28 December 1978
    UA 173

  25. Photo by Randy Peters on Unsplash

  26. Courtesy of the Oregonian

    The cause was the Captain’s failure to properly monitor the
    aircraft’s fuel state and to respond to both the low fuel state
    and the crew members’ advisories regarding the fuel state.
    The factor contributing to the accident was the failure of
    Beebee and Frosty either to fully comprehend the criticality
    of the fuel state or to successfully communicate their concern
    to the captain.

  28. Courtesy of The Flight Channel

  29. Crew Resource Management
    Recommendations after UA 173
    • Participative management for captains: in other words,
    providing a non-threatening environment that actively
    encourages collaboration
    • Assertiveness training for other cockpit crew members: how
    to get your point across effectively in a non-threatening
    manner

  30. Courtesy of the Oregonian

  31. Situational Awareness
    “Knowing what’s going on around you”
    Fixation -> loss of situational awareness

  32. Situational Awareness
    Contributing Factors
    • Knowledge of the Automation and Abstractions present
      • Are you aware of the abstractions that you interact with, even
        if only on a rudimentary level? Could you find the resources
        you need if the situation required it? Do you have a basic
        understanding of what happens when you run a Jenkins build?
        What’s it doing and what could go wrong?
    • Stress and Workload
      • How many tickets are assigned to you right now? How many
        meetings do you have planned? Which deadlines are real and
        which are arbitrary? Be honest with yourself and others about
        what you can accomplish

  33. Situational Awareness
    Contributing Factors
    • Physiological factors
      • Did you sleep enough? Are you hydrated? Did you eat? Did you poop?
    • Abilities, Experience, Training
      • What are your limitations or weak spots with respect to your
        role? How can you patch them? Who can help you? What knowledge
        could you share with others?
    • Preconceptions and bias
      • Is your gut instinct actually just bias? Why does a certain
        course of action appeal to you? What evidence do you have to
        support your position? Evidence against?

  34. Situational Awareness
    Is not a solo endeavour!
    • Contribute to a psychologically safe work environment
    • Share your knowledge, no matter your experience level
    • Speak up when things don’t add up

  35. Thank you
    Adele Carpenter
    @iam_carpenter
