Don't Get Distracted

Caleb Hearth
February 17, 2018

In 2011, working with a team of interns at a Department of Defense contractor, Caleb built a Wi-Fi geolocation app to locate hotspots. It could pinpoint the position in 3D space of every hotspot near you within seconds. His team developed formulas to model signal strength and probable distance, and used machine learning to optimize completion time and accuracy. He was so caught up in the details that it took months to see that the app would be used to kill people. What do we do when we discover that we're building something immoral or unethical? How can we think through the uses of our software to avoid this problem entirely?

To actually see this talk, which does in fact have mostly empty slides, visit https://calebhearth.com/talks/dont-get-distracted.
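
The talk doesn't share the team's formulas, but the kind of model such an app leans on is well known: received Wi-Fi signal strength falls off predictably with distance, so an RSSI reading can be converted into a rough range, and ranges from several positions can be combined to place a hotspot in space. Below is a minimal, illustrative Python sketch assuming the standard log-distance path loss model with made-up constants (RSSI_AT_1M, PATH_LOSS_EXPONENT); it is not the team's implementation.

    # Assumed constants for illustration only; real systems calibrate these.
    RSSI_AT_1M = -40.0        # received power (dBm) measured 1 meter from the hotspot
    PATH_LOSS_EXPONENT = 2.5  # roughly 2 in free space, higher indoors

    def estimate_distance(rssi_dbm: float) -> float:
        """Estimate distance (meters) to a hotspot from an RSSI reading (dBm),
        using the log-distance path loss model:
            rssi = RSSI_AT_1M - 10 * n * log10(d)
        solved for d."""
        return 10 ** ((RSSI_AT_1M - rssi_dbm) / (10 * PATH_LOSS_EXPONENT))

    # With these assumed constants, a reading of -67 dBm works out to roughly 12 meters.
    print(round(estimate_distance(-67.0), 1))

Range estimates like this, taken from several known positions, can then be combined (trilateration) to estimate the hotspot's location.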


Transcript

  1. Don't Get Distracted A cautionary tale by @calebthompson

  2. CONTENT WARNING: This talk contains references to, but no details of, killing and suicide.
  3. These slides intentionally left blank.

  4–65. None
  66. “As developers, we are often one of the last lines of defense against potentially dangerous and unethical practices. We’re approaching a time where software will drive the vehicle that transports your family to soccer practice. There are already AI programs that help doctors diagnose disease. It’s not hard to imagine them recommending prescription drugs soon, too. The more software continues to take over every aspect of our lives, the more important it will be for us to take a stand and ensure that our ethics are ever-present in our code. Since that day, I always try to think twice about the effects of my code before I write it. I hope that you will too.”
  67–68. None
  69. “This program denies ride requests to users who are violating our terms of service — whether that's people aiming to physically harm drivers, competitors looking to disrupt our operations, or opponents who collude with officials on secret 'stings' meant to entrap drivers.”
  70–78. None
  79. “Well-intended actions, including those that accomplish assigned duties, may lead to harm unexpectedly. In such an event the responsible person or persons are obligated to undo or mitigate the negative consequences as much as possible. One way to avoid unintentional harm is to carefully consider potential impacts on all those affected by decisions made during design and implementation.”
  80–84. None
  85. Kumail Nanjiani @kumailn 4:56 PM - 1 Nov 2017 Thread: I know there's a lot of scary stuff in the world rn, but this is something I've been thinking about that I can't get out of my head.
  86. Kumail Nanjiani @kumailn 4:57 PM - 1 Nov 2017 As a cast member on a show about tech, our job entails visiting tech companies/conferences etc. We meet ppl eager to show off new tech.
  87. Kumail Nanjiani @kumailn 4:59 PM - 1 Nov 2017 Often we'll see tech that is scary. I don't mean weapons etc. I mean altering video, tech that violates privacy, stuff w obv ethical issues.
  88. Kumail Nanjiani @kumailn 4:59 PM - 1 Nov 2017 And we'll bring up our concerns to them. We are realizing that ZERO consideration seems to be given to the ethical implications of tech.
  89. Kumail Nanjiani @kumailn 5:00 PM - 1 Nov 2017 They don't even have a pat rehearsed answer. They are shocked at being asked. Which means nobody is asking those questions.
  90. Kumail Nanjiani @kumailn 5:10 PM - 1 Nov 2017 "We're not making it for that reason but the way ppl choose to use it isn't our fault. Safeguard will develop." But tech is moving so fast.
  91. Kumail Nanjiani @kumailn 5:11 PM - 1 Nov 2017 That there is no way humanity or laws can keep up. We don't even know how to deal with open death threats online.
  92. Kumail Nanjiani @kumailn 5:12 PM - 1 Nov 2017 Only "Can we do this?" Never "should we do this?" We've seen that same blasé attitude in how Twitter or Facebook deal w abuse/fake news.
  93. Kumail Nanjiani @kumailn 5:12 PM - 1 Nov 2017 Tech has the capacity to destroy us. We see the negative effect of social media. & no ethical considerations are going into dev of tech.
  94. Kumail Nanjiani @kumailn 5:13 PM - 1 Nov 2017 You can't put this stuff back in the box. Once it's out there, it's out there. And there are no guardians. It's terrifying. The end.
  95–109. None
  110. Don't Get Distracted A cautionary tale by @calebthompson https://www.calebthompson.io/dont-get-distracted