Killer Robots and Rogue AI

When it comes to safety risks with technology, are we worrying about the right things?

Marianne Bellotti

September 16, 2022

Transcript

  1. About Me
     • Author of “Kill It With Fire”
     • 20+ years of software experience
     • Specialities:
       ◦ System dynamics
       ◦ Applied formal methods
       ◦ Architecture and system rescue
     • Engineering manager at Rebellion Defense

  2. About Me
     • Author of “Kill It With Fire”
     • 20+ years of software experience
     • Specialities:
       ◦ System dynamics
       ◦ Applied formal methods
       ◦ Architecture and system rescue
     • Engineering manager at Rebellion Defense
     I work in defense

  3. :/

  4. Real Government/Defense Tech
     • Old computers
       ◦ 2016: The DoD orders branches to upgrade to Windows 10
       ◦ Many computers too old to support it
       ◦ 2018: "For the most part, with the exception of a couple [of agencies], we are there," Essye Miller, the Acting Department of Defense Chief Information Officer
     • Bad networks
     • Low connectivity
     • Data is old and in bad formats (PowerPoint files)

  5. Tech That People Have Safety Concerns About
     • Killer Robots
     • Drone Swarms
     • Spying Smart Home Devices
     • Deep Fakes
     • Smart Dust (microelectromechanical systems)
     • Quantum Computing

  6. Technologies That Actually Threaten You
     • Disruption to health care systems
       ◦ WannaCry (2017)
       ◦ NHS outage (2018)
       ◦ 911 outage: 6,000 unanswered calls
     • Transportation chaos
       ◦ British Airways: 1,000+ flights in 2017, 300 flights in 2019
       ◦ Heathrow 2020: 600 flights
       ◦ Nissan airbag sensor glitch
       ◦ Toyota Prius engine control glitch
       ◦ Tesla Autopilot
       ◦ Boeing 787 Dreamliner, 737 MAX
     • Missing money
       ◦ HSBC 2015: 275,000 payments
       ◦ RBS 2015: 600,000 payments
       ◦ TSB 2018: millions
     • Misc
       ◦ 2015: 3,200 prisoners released early
       ◦ F-35 target detection failures
     • Linked to an increase in:
       ◦ Myopia
       ◦ Musculoskeletal issues
       ◦ Sleep problems
       ◦ Mental health issues

  7. Are We Concerned About the Right Things?
     • The greatest threat to safety isn’t the doomsday tech, it’s the failure of the normal technology that is everywhere.
     • Feedback loops and context are important
     • Many engineers don’t think that anything they build could have safety consequences

  8. Are We Concerned About the Right Things?
     • The greatest threat to safety isn’t the doomsday tech, it’s the failure of the normal technology that is everywhere.
     • Feedback loops and context are important
     • Many engineers don’t think that anything they build could have safety consequences
     Kind of my jam →

  9. Reasons to Think About Safety
     • Build more interesting and useful things!
     • Integrate the human experience into the technology
     • Find bugs and design flaws before you waste a lot of time/energy

  10. Traditional View -vs- New View of Safety
      Traditional
      • Specification
      • Verification
      • Aggressive testing to determine probability of failure
      New View
      • Ergonomics (how people adapt)
      • Sociological impact
      • System dynamics

  11. Traditional View -vs- New View of Safety
      Traditional
      • Specification
      • Verification
      • Aggressive testing to determine probability of failure
      New View
      • Ergonomics (how people adapt)
      • Sociological impact
      • System dynamics
      Sometimes our solutions make things much worse

  12. Ironies of Automation
      • Lisanne Bainbridge, published in 1983
      • Experts build and maintain their expertise by operating a system
      • Automation takes away those experiences
        ◦ Makes it difficult for a “human in the loop” to supervise
        ◦ Experts lose skills
        ◦ Onboarding becomes more difficult
        ◦ More complex systems that are harder to reason about

  13. Tiers of Analysts
      Tier 1: Kid with no college. Watches sensor data and takes notes.
      Tier 2: Takes all the notes from Tier 1 and aggregates them into themes, locations, trends.
      Tier 3: Takes the trend data from Tier 2 and produces intelligence and strategy.

  14. Tiers of Analysts
      Tier 1: Kid with no college. Watches sensor data and takes notes.
      Tier 2: Takes all the notes from Tier 1 and aggregates them into themes, locations, trends.
      Tier 3: Takes the trend data from Tier 2 and produces intelligence and strategy.
      Let’s replace Tier 1 with some AI

  15. Tiers of Analysts
      Tier 1: Kid with no college. Watches sensor data and takes notes.
      Tier 2: Takes all the notes from Tier 1 and aggregates them into themes, locations, trends.
      Tier 3: Takes the trend data from Tier 2 and produces intelligence and strategy.

  16. But also in your world too
      • Massive outages triggered by configuration changes
      • Edge cases triggering unexpected behavior as total activity scales
      • What is monitoring the monitoring system?
      • Onboarding Xooglers at USDS

  17. But also in your world too
      • Massive outages triggered by configuration changes
      • Edge cases triggering unexpected behavior
      • What is monitoring the monitoring system? (see the sketch after this slide)
      • Onboarding Xooglers at USDS
      SREcon 2019 Asia/Pacific: “Ironies of Automation: A Comedy in Three Parts”

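      Not from the deck, but a minimal sketch of one common answer to “what is monitoring the monitoring system?”: a dead man’s switch, where the monitor must keep checking in with something independent of it. The file path, interval, and function names are assumptions for illustration only.

          import os
          import time

          # Hypothetical dead man's switch: the monitoring system touches a
          # heartbeat file on every check cycle, and a watchdog running on
          # separate infrastructure pages a human if that heartbeat goes stale.
          HEARTBEAT_FILE = "/var/run/monitor.heartbeat"   # assumed path
          MAX_SILENCE = 120                               # seconds before paging

          def record_heartbeat() -> None:
              """Called by the monitoring system at the end of each check cycle."""
              with open(HEARTBEAT_FILE, "w") as f:
                  f.write(str(time.time()))

          def watchdog(alert) -> None:
              """Runs independently of the monitor it watches; `alert` is any pager callback."""
              while True:
                  time.sleep(MAX_SILENCE)
                  try:
                      age = time.time() - os.path.getmtime(HEARTBEAT_FILE)
                  except FileNotFoundError:
                      age = float("inf")
                  if age > MAX_SILENCE:
                      alert(f"monitoring system silent for {age:.0f}s")
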
  18. Optimization -vs- Resilience
      • We acknowledge that premature optimization has its downsides when it comes to costs and workload (SLOs)
      • But adding more and more specific functionality is also a form of optimization (optimizing for a use case instead of performance)
      • Increased optimization often decreases resilience
      Wait, why? 😟

  19. Optimization -vs- Resilience
      • Multiple ways of doing the same thing make technology more accessible
      • Those alternative pathways come in handy when a blocker to the primary operation appears (sketched below)
      • Variation is the first thing optimization tends to eliminate

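      A toy illustration (not from the talk) of those alternative pathways in code: the resilient version has two ways to answer the same question, while the “optimized” version deletes the slower path and, with it, the fallback. The cache and database objects are hypothetical stand-ins.

          def get_customer(customer_id, cache, database):
              """Resilient version: two pathways to the same answer."""
              record = cache.get(customer_id)
              if record is not None:
                  return record
              # Slower alternative pathway; still works when the cache is down.
              return database.fetch(customer_id)

          def get_customer_optimized(customer_id, cache):
              """'Optimized' version: one pathway, no variation, no fallback."""
              return cache[customer_id]   # fails outright if the cache is unavailable
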
  20. Training in War -vs- Peace
      • During peacetime, people want to build “elite” fighting forces and focus on weeding people out
      • But the reality of war is that militaries need people to build expertise fast, so that resources go toward fighting the conflict and not training new recruits
      • When technology robs people of expertise rather than helping them build it, losing experts hurts much more
      • We’re watching this dynamic play out in Ukraine right now, where the biggest blocker to military aid is training fighters to use the tech

  21. In your world…
      • COTS solutions built for specific use cases reused for other things
        ◦ Salesforce, Wordpress, etc…
        ◦ Introduces security holes
        ◦ Dead/junk code
        ◦ Modifications not backed up anywhere
      • Missing market adjustments
        ◦ Facebook losing users to Twitter
        ◦ Instagram losing them to TikTok

  22. Human Error as a Social Function
      • “The psychology of accident investigation: epistemological, preventive, moral and existential meaning-making,” Sidney Dekker, 2014
      • Society uses post-incident accountability to process grief
        ◦ Suffering == human moral choice
        ◦ Desire for single acts and actors as the cause of failure
        ◦ Accidents tend to have neither obvious causes nor clear, linear cause-and-effect relationships
      • We can always find a human to blame
        ◦ Who was operating?
        ◦ Who was maintaining?
        ◦ Who was supervising?
        ◦ Who manufactured?

  23. Kasparov's Law
      • Humans paired with machines often beat both humans and machines alone
      • But only if they are working together in a specific way
        ❌ Computer recommending a move to the human
        ✅ Computer listing a set of possible responses to the human’s planned move (see the sketch after this slide)
      • People will always alter their use of technology so that they can blame the technology when failure happens
        ◦ Fully delegating decisions to the machine
        ◦ Using the computer in unapproved ways
        ◦ They will do this even when it risks their lives (Tesla Autopilot crashes)

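      A hypothetical sketch (not from the talk) of the difference between those two interaction patterns; the engine object and its methods are invented for illustration, not a real chess API.

          def machine_leads(engine, position):
              """❌ The computer recommends a move; the human tends to rubber-stamp it."""
              return engine.best_move(position)

          def human_leads(engine, position, planned_move, top_n=3):
              """✅ The human proposes a move; the computer enumerates likely
              responses, so the human keeps ownership of the decision."""
              replies = engine.responses_to(position, planned_move)
              return sorted(replies, key=lambda r: r.score, reverse=True)[:top_n]
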
  24. Spell Check -vs- Autocorrect
      Spell Check: “I love sfatey”
      • When it’s right you’re grateful
      • When it’s wrong we just ignore it
      • Impossible to passively delegate
      Autocorrect: “I love fates”
      • When it’s right you don’t notice
      • When it’s wrong it creates confusion or annoyance
      • Extra work proactively overriding

  25. Jens Rasmussen’s Operations Gradient
      [Diagram: the operating point drifts within three boundaries: economic failure, unacceptable workload, and functionally acceptable performance]
      THE SYSTEM IS ALWAYS ON THE VERGE OF FAILURE!!!!