Slide 1

Slide 1 text

Mayday! Adele Carpenter Software lessons from aviation disasters @iam_carpenter

Slide 2

Slide 2 text

Photo by Graham Dinsdale via Jetphotos

Slide 3

Slide 3 text

Approximation of cabin
Photo by Usien via Wikimedia Commons

Slide 4

Slide 4 text

Photo by Nick Bair via Wikimedia Commons

Slide 5

Slide 5 text

Chain of events Partnair 394

Slide 6

Slide 6 text

F-16 sonic boom? Lockerbie bombing Part II?
Photo by Chandler Cruttenden on Unsplash; photo via Wikimedia Commons

Slide 7

Slide 7 text

Finn gathers the team...
• How can we prove ourselves wrong?
• What details might we be ignoring?
• What else could cause a mid-air break up?
• What are the precise levels? Other sources of residue?
• No terrorist groups have claimed credit
• Countless possibilities, need to narrow the scope

Slide 8

Slide 8 text

F-16 sonic boom? Lockerbie bombing Part II?
Photo by Chandler Cruttenden on Unsplash; photo via Wikimedia Commons

Slide 9

Slide 9 text

Photo by Ana Municio on Unsplash

Slide 10

Slide 10 text

Question yourself
• How can I prove myself wrong?
• What details might I be ignoring because they don’t fit my theory or solution?
• What else could cause this issue or situation?

Slide 11

Slide 11 text

Photo courtesy of BEA; photo courtesy of NTSB

Slide 12

Slide 12 text

No content

Slide 13

Slide 13 text

Photo by Brian Gordon via picryl

Slide 14

Slide 14 text

Photo by Nick Bair via Wikimedia Commons

Slide 15

Slide 15 text

No content

Slide 16

Slide 16 text

Photo by Brian Gordon via picryl

Slide 17

Slide 17 text

Chain of events Partnair 394

Slide 18

Slide 18 text

Counterfeit Code?

Slide 19

Slide 19 text

Improving software quality: Finn’s top tips
• Treating outcomes as a chain of events to learn from
• Taking our time to investigate all options
• Accepting responsibility for our code - no “counterfeit code”
• Following the evidence
• Questioning ourselves rigorously

Slide 20

Slide 20 text

No content

Slide 21

Slide 21 text

Photo by Dan Parlante on Unsplash

Slide 22

Slide 22 text

Approximation of actual cockpit
Photo by Keishi Nukina, KNaviation
Captain McBroom, First Officer Beebee, Second Officer Frosty
DC-8-61 cockpit, 28 December 1978, UA 173

Slide 23

Slide 23 text

Approximation of actual cabin
Courtesy of Runway Girl Network

Slide 24

Slide 24 text

Approximation of actual cockpit
Photo by Keishi Nukina, KNaviation
Captain McBroom, First Officer Beebee, Second Officer Frosty
DC-8-61 cockpit, 28 December 1978, UA 173

Slide 25

Slide 25 text

Photo by Randy Peters on Unsplash

Slide 26

Slide 26 text

Courtesy of the Oregonian

Slide 27

Slide 27 text

The cause was the Captain’s failure to properly monitor the aircraft’s fuel state and to respond to both the low fuel state and the crewmembers’ advisories regarding the fuel state. A contributing factor was the failure of Beebee and Frosty either to fully comprehend the criticality of the fuel state or to successfully communicate their concern to the captain.

Slide 28

Slide 28 text

Courtesy of The Flight Channel

Slide 29

Slide 29 text

Crew Resource Management: Recommendations after UA 173
• Participative management for captains: providing a non-threatening environment that actively encourages collaboration
• Assertiveness training for other cockpit crew members: how to get your point across effectively in a non-threatening manner

Slide 30

Slide 30 text

Courtesy of the Oregonian

Slide 31

Slide 31 text

Situational Awareness
“Knowing what’s going on around you”
Fixation → loss of situational awareness

Slide 32

Slide 32 text

Situational Awareness: Contributing Factors
• Knowledge of the automation and abstractions present
  • Are you aware of the abstractions you interact with, even at a rudimentary level? Could you find the resources you need if the situation required it? Do you have a basic understanding of what happens when you run a Jenkins build? What is it doing, and what could go wrong?
• Stress and workload
  • How many tickets are assigned to you right now? How many meetings do you have planned? Which deadlines are real and which are arbitrary? Be honest with yourself and others about what you can accomplish.

Slide 33

Slide 33 text

Situational Awareness: Contributing Factors
• Physiological factors
  • Did you sleep enough? Are you hydrated? Did you eat? Did you poop?
• Abilities, experience, training
  • What are your limitations or weak spots with respect to your role? How can you patch them? Who can help you? What knowledge could you share with others?
• Preconceptions and bias
  • Is your gut instinct actually just bias? Why does a certain course of action appeal to you? What evidence do you have to support your position? What evidence is against it?

Slide 34

Slide 34 text

Situational Awareness: Not a solo endeavour!
• Contribute to a psychologically safe work environment
• Share your knowledge, no matter your experience level
• Speak up when things don’t add up

Slide 35

Slide 35 text

Thank you Adele Carpenter @iam_carpenter