Ethics for the AI Age

cennydd
February 10, 2017

[As presented at Interaction ’17, NYC.]

Over the next two decades, connected products will demand an unprecedented amount of user trust. Technologists and designers will ask the public for yet more of their attention, more of their data, more of their lives. AIs will know users’ deepest secrets. Co-operating devices will automate security and safety. Autonomous vehicles will even make life-or-death decisions for passengers.

But ours is an industry still unwilling to grapple with the ethical, social, and political angles of this future. We mistakenly believe that technology is neutral; that mere objects cannot have moral relevance. And so we make embarrassing blunders – racist chatbots, manipulative research, privacy violations – that undermine trust and harm those we should help.

This is a dangerous trajectory. We urgently need a deeper ethical dialogue about emerging technology, and interaction design’s role within it.

Transcript

  1. @CENNYDD
    ETHICS FOR THE AI AGE


  2. The neutrality
    fallacy.


  3. Technology is not neutral.


  4. bbc.co.uk · 25 March 2016
    theguardian.com · 2 October 2014
    azcentral.com · 31 October 2016
    cnn.com · 20 April 2016


  5. The industry
    hasn’t earned
    trust.


  6. Design is applied ethics.


  7. Progress
    deserves ethical
    accompaniment.


  8. ETHICAL
    CROSSROADS


  9. The promise of
    automation & AI.


  10. AI inherits bias.
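
    A minimal sketch of the mechanism, with invented numbers: a model fitted
    to historical decisions reproduces whatever skew those decisions contain.
    The groups, rates, and threshold below are hypothetical.

    from collections import defaultdict

    # Hypothetical past loan decisions: (group, approved).
    history = ([("A", True)] * 80 + [("A", False)] * 20
               + [("B", True)] * 40 + [("B", False)] * 60)

    # "Training": record each group's historical approval rate.
    outcomes = defaultdict(list)
    for group, approved in history:
        outcomes[group].append(approved)
    model = {g: sum(v) / len(v) for g, v in outcomes.items()}

    # "Inference": approve when the learned rate clears a threshold.
    def predict(group, threshold=0.5):
        return model[group] >= threshold

    print(model)          # {'A': 0.8, 'B': 0.4}
    print(predict("A"))   # True:  group A inherits past favour
    print(predict("B"))   # False: group B inherits past disadvantage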


  11. [Image-only slide]

  12. Agency vs.
    concealment.


  13. “We have to demand
    friction and truth from
    the products we use.”
    —Kelsey Campbell-Dollaghan


  14. Physical threat.


  15. [Image-only slide]

  16. “The answer is almost always
    ‘slam on the brakes’.”
    —Andrew Chatham


  17. The trolley problem
    isn’t.


  18. [Image-only slide]

  19. Control, handover, & the deadly seams.


  20. Level 0: No autonomy, just warnings.
    Level 1: Light automation of specific functions, e.g. cruise/lane control.
    Level 2: Autonomous steering, acceleration, braking. Driver disengaged but ready.
    Level 3: Autonomous in limited environments, e.g. freeways. Driver can relax.
    Level 4: Fully autonomous except e.g. severe weather. Driver attention not required.
    Level 5: Full autonomy to any legal location in all conditions.

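    A sketch of where those deadly seams sit, assuming the SAE-style levels
    above. The enum names and the takeover rule are illustrative, not a real
    vehicle controller.

    from enum import IntEnum

    class Autonomy(IntEnum):
        NONE = 0         # warnings only
        ASSIST = 1       # cruise/lane control
        PARTIAL = 2      # steering and speed; driver disengaged but ready
        CONDITIONAL = 3  # limited environments; driver can relax
        HIGH = 4         # attention not required, rare exceptions
        FULL = 5         # any legal location, all conditions

    def mid_journey_handover_possible(level: Autonomy) -> bool:
        # The dangerous middle: the system drives, yet may hand control
        # back to a human at short notice.
        return Autonomy.ASSIST <= level <= Autonomy.CONDITIONAL

    for level in Autonomy:
        print(level.name, mid_journey_handover_possible(level))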

  21. What limits to
    automation?


  22. [Image-only slide]

  23. Automation, labour,
    and capital.


  24. THE ETHICAL
    DESIGNER


  25. We become more moral by working at it.


  26. Four ethical tests:


  27. ETHICAL TEST #1
    What if everyone did what I'm about to do?
    (DEONTOLOGICAL ETHICS)


  28. ETHICAL TEST #2
    Am I treating people as ends or means?
    (DEONTOLOGICAL ETHICS)


  29. ETHICAL TEST #3
    Am I maximising happiness for the greatest number?
    (UTILITARIANISM / CONSEQUENTIALIST ETHICS)


  30. ETHICAL TEST #4
    Would I be happy for this to be published in tomorrow's papers?
    (VIRTUE ETHICS)


  31. Choosing an
    ethical lens.


  32. Are some verticals
    unethical?


  33. Accessibility is an
    ethical litmus.


  34. You can’t change it
    from the inside.


  35. THE ETHICAL
    COMPANY


  36. Business case:
    revenue, cost, risk.


  37. Build ethical
    infrastructure.


  38. Appoint a designated
    dissenter.


  39. Diversity is an ethical
    early-warning system.


  40. Research has several
    ethical benefits.


  41. Policies and docs
    sometimes help.


  42. Policies and docs
    sometimes help.


  43. But culture dictates.


  44. THE ETHICAL
    COMMUNITY


  45. The politics of tech:


  46. Technological
    determinism.


  47. Internet
    exceptionalism.


  48. Data as ideology.


  49. Public vs. private morality.


  50. Disregarding the
    politics of our work is
    itself a political act.


  51. Regulation won’t
    save us.


  52. Nor will codes of ethics.


  53. It’s down to us.


  54. FURTHER READING
    Moralizing Technology · Peter-Paul Verbeek
    Design for Real Life · Eric Meyer & Sara Wachter-Boettcher
    The Little Book of Design Research Ethics · IDEO
    The Republic · Plato
    Groundwork of the Metaphysic of Morals · Immanuel Kant
    Utilitarianism · J.S. Mill
    Introducing Ethics · Dave Robinson & Chris Garratt
    Authority and American Usage · David Foster Wallace
    Four Futures: Life After Capitalism · Peter Frase
    The Politics of Bitcoin: Software as Right-Wing Extremism · David Golumbia


  55. @CENNYDD
    THANK YOU.
