

Built-in Bias: What Are the Ethical Responsibilities of Developers?

Computer scientists at Harvard built an artificial intelligence (AI) system that could be used to identify gang crimes. This development in predictive policing has immense potential to “ignite an ethical firestorm,” as Science magazine put it. Do developers have an ethical responsibility when it comes to developing tech that ultimately becomes a tool of state violence? Should they have a say in how the tech they build is used once it’s delivered to the client?

In this talk, I will examine how efforts to move AI forward are built on statistical biases, and why those who build these systems should care. I will also discuss what a code of ethics could look like when approached from a developer’s perspective.

Further reading & resources: http://bit.ly/DevEthicsResources


Stephanie Vaughn

August 26, 2020



Transcript

1. About Me › Detroit native › Communications professional-turned-Techie › Front-End Web, leaning towards UX Design › Passionate about Tech Literacy + Computer Science Education @_SLVaughn
2. Feel free to live-tweet this talk! Tag me while you live-tweet: @_SLVaughn Use hashtags: #HiddenGemsConf #StillWeRise #DevEthics Let’s Engage!
3. Abstract In this talk, I will examine how efforts to move AI forward are built on statistical biases, and why those who build these systems should care. I will also discuss what a code of ethics could look like when approached from a developer’s perspective.
4. Main Questions › Do developers have an ethical responsibility? › Should they have a say in how the tech they build is used? › Why should those who build software care about how it’s used? › How would YOU personally approach defining a code of ethics?
5. Do developers have an ethical responsibility when it comes to developing tech that ultimately becomes a tool of state violence?
6. First, let’s define “state violence”: “Acts of violence committed by an official state, military or sponsored by a sovereign government outside of the context of a declared war, which target civilians or show a disregard for civilian life in attacking targets—either people or facilities.” — “State violence,” Segen’s Medical Dictionary
7. Data for Black Lives (D4BL) • A group of activists, organizers, and mathematicians committed to the mission of using data science to create concrete and measurable change in the lives of Black people. • Data for Black Lives seeks to mobilize scientists around racial justice issues.
8. D4BL Opening Panel: Abolition in the Age of Big Data • Templates: while most DBs are populated from passport photos, or from states that enroll all driver’s license photos in the federal DB, many law enforcement DBs are populated with mug shots.
9. D4BL Opening Panel: Abolition in the Age of Big Data • For areas with known issues around problematic policing, facial recognition DBs contain mostly mug shots of Black and brown faces.
10. D4BL Opening Panel: Abolition in the Age of Big Data • When facial recognition DBs fail, they don’t simply fail to identify anyone; many misidentify.
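
A minimal Python sketch (an illustration, not from the panel) of why such systems misidentify rather than simply fail: a matcher that always returns the nearest enrolled face, with no confidence threshold, yields some identification for every query, right or wrong.

    import numpy as np

    # Illustrative sketch: a matcher with no "no match" option always returns
    # the nearest enrolled face, so a face absent from the DB gets misidentified
    # rather than rejected. The embeddings are random stand-ins, not real data.
    rng = np.random.default_rng(0)
    db_embeddings = rng.random((1000, 128))  # stand-in for enrolled mug shots

    def identify(query, threshold=None):
        dists = np.linalg.norm(db_embeddings - query, axis=1)
        best = int(np.argmin(dists))
        if threshold is not None and dists[best] > threshold:
            return None              # an honest "no match in the DB"
        return best                  # otherwise: always *some* identification

    query = rng.random(128)                 # a face that is NOT in the DB
    print(identify(query))                  # some index -- a misidentification
    print(identify(query, threshold=0.5))   # with a threshold, correctly None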
11. D4BL Opening Panel: Abolition in the Age of Big Data • Underneath it all, high arrest rates are being rewarded. The priority is to find someone and report numbers to the city, rather than the well-being of communities.
12. “Technology in the hands of those in power will continue to be used to oppress.” — Data for Black Lives: Abolition in the Age of Big Data
13. TechCrunch: Harvard-MIT initiative grants $750K to projects looking to keep tech accountable › Grant recipients include: › Tattle, a project aiming to combat disinformation and false news spreading on WhatsApp. › Sidekick, which uses machine learning tools to help journalists scour thousands of pages of documents for interesting data. › The Rochester Institute of Technology, which will be using its grant to look into detecting manipulated video, both designing its own techniques and evaluating existing ones.
14. “It’s naive to believe that the big corporate leaders in AI will ensure that these technologies are being leveraged in the public interest. Philanthropic funding has an important role to play in filling in the gaps and supporting initiatives that envision the possibilities for AI outside the for-profit context.” — Tim Hwang, Director, Ethics and Governance of AI Initiative
15. Should developers have a say in how the tech they build is used once it’s delivered to the client?
16. “We call on Amazon to stop selling Rekognition to law enforcement as legislation and safeguards to prevent misuse are not in place.” — “On Recent Research Auditing Commercial Facial Analysis Technology” (March 26, 2019)
17. Risk Assessment Predictions: Bernard Parker, left, was rated high risk; Dylan Fugett was rated low risk. Dylan was arrested three times on drug charges after that. Bernard was not. (Josh Ritchie for ProPublica)
18. “This is just like a terms and conditions document stating how far a developer is willing to go in the design, implementation and administering of software that can engage in cyber warfare activities and espionage. This would give an employer clear light on what a developer is willing to design, implement and administer.” — Jed Ogwumike
19. • Always consider potential misuse... just as [you] would traditional security vulnerabilities. • If possible, mitigate misuse during development—for example, taking privacy concerns into account and only including information that is needed rather than what is available. • Use for state violence is a manifestly greater concern than SQL injection or buffer overflow. • We have disclosure practices and guidelines about code vulnerabilities. Surely we should have disclosure and guidelines about societal vulnerabilities. — Anonymous
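
A minimal Python sketch of the second bullet’s data-minimization idea (an illustration of mine; the field names are hypothetical): whitelist only the fields a feature needs, so attributes that are merely available never enter the system.

    # Hypothetical illustration of "only include what is needed":
    # REQUIRED_FIELDS would come from a documented review of what the
    # feature actually requires, not from what the data source offers.
    REQUIRED_FIELDS = {"case_id", "timestamp", "location_zip"}

    def minimize_record(raw_record):
        """Keep only the whitelisted fields; drop everything else before storage."""
        return {k: v for k, v in raw_record.items() if k in REQUIRED_FIELDS}

    incoming = {
        "case_id": "A-1042",
        "timestamp": "2020-08-26T14:00:00Z",
        "location_zip": "48226",
        "full_name": "Jane Doe",                   # available, but not needed
        "mugshot_url": "http://example.com/img",   # available, but not needed
    }

    stored = minimize_record(incoming)
    assert "full_name" not in stored and "mugshot_url" not in stored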
20. “I like the JSON license approach. Don’t allow people to use my tech to harm others. Just start from there and work out any thorns along the way.” — Mark Henderson
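
For reference, the JSON license Henderson mentions is the standard MIT license text plus one added clause:

    The Software shall be used for Good, not Evil.

The clause’s vagueness is one of the “thorns” he alludes to: it has been criticized as unenforceable, and some distributions treat JSON-licensed software as non-free because of it.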
21. “If I’m delivering software, the user must use it within the scope of what the license states. In the license, I will provide ethical use of the software and hopefully provide provisions of penalties. I will also try to embed safeguards that allow the software to be used in a way where it will not violate the ethical license clauses.” — Troy Connor
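
One hypothetical way to embed the kind of safeguard Connor describes (a sketch under my own assumptions, not his implementation): require callers to declare an intended use at startup and refuse uses that the license’s ethical clauses disallow.

    # Hypothetical sketch of a license-clause safeguard. The clause list and
    # the API are illustrative assumptions, not Connor's actual design.
    PROHIBITED_USES = {"mass_surveillance", "targeting_individuals"}

    class EthicalLicenseError(RuntimeError):
        """Raised when a declared use violates the license's ethical clauses."""

    def init_engine(declared_use):
        # Refuse to start when the declared use is disallowed by the license.
        if declared_use in PROHIBITED_USES:
            raise EthicalLicenseError(
                f"'{declared_use}' violates the ethical-use clauses of the license."
            )
        return {"declared_use": declared_use, "status": "ready"}

    engine = init_engine("journalism_research")    # allowed
    # init_engine("mass_surveillance")             # raises EthicalLicenseError

A check like this is trivially bypassed by a motivated user, so it serves more as a declaration of intent and a contractual hook than a hard control.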
22. “I’ve helped organize in the tech industry, like Tech Workers Coalition (techworkerscoalition.org) does. Education & mentoring for organizing fellow tech workers seems more effective than a code of ethics.” — Sleepymachine
23. “I would make it very similar to the Hippocratic Oath, particularly around non-maleficence.” — Amara Graham
24. “I’m not a huge believer in a code of ethics in software development. I find that the mob mentality kicks in and it veers too far in one direction, and becomes more harmful than good. I believe in a simple set of guidelines, used to push an agenda of common sense, courtesy and productivity.” — Andy Wojnarek
25. “We need to fight against the information cartel and embrace open data and open source AI. There is no way of stopping somebody from misusing AI. So the job of developers is to denounce the black sheep and not join the Dark Side of the Force.” — Romeo Kienzler
26. “I honestly think the code of ethics we should have is very simple: do no harm. This would require a lot of sacrifices, and perhaps people would feel we were going backwards in our quality of life (a lot of the technology we use may harm others or the Earth in some capacity), but by and large, I think we, as humanity, would be better for it.” — Samah Majadla
27. “The basic idea is that the developer engages with users and stakeholders on an ongoing basis. If that is being done, then ethical considerations would help determine priorities. One thing that should also be noted, however: no level of ‘ethical vigilance’ is going to eliminate ethical issues. In other words, trying to use technology to control human behavior is a fool’s errand and ignores personal responsibility. Good technology is just a tool for the human. Bad technology often attempts to change or engineer human behavior before it evolves in an organic way.” — Keith C. Perry, MS E.E.
28. “The question every developer should ask themselves is this: could the line of code I am putting down right now be used in any shape or form to create Skynet in the future? If the answer is yes, then don’t write that code. Period.” — Sarthak Pati
29. Conclusion › Why should we have a code of ethics in software development? › How should we approach ethics in software development?
30. Why should we have a code of ethics in software development? “We spend the majority of our time working. Why would we want to put our life energy into something that consumes that much of our time without being thoughtful and intentional?” — Devney Hamilton, Sassafras Tech Collective
31. Why should we have a code of ethics in software development? “Most professions have ethics standards & guidelines; software development is relatively new, and that is taken for granted.” — Devney Hamilton, Sassafras Tech Collective
32. Why should we have a code of ethics in software development? “No one practicing software development should take the lack of a code of ethics for granted. It’s not needed any less. Software impacts our lives to the same degree as seeing a doctor or seeking counsel from a lawyer.” — Devney Hamilton, Sassafras Tech Collective
33. How should we approach ethics in software development? › Transparency; true ethics requires open source. › Democratically controlled. › Built by teams that are representative of the world we live in. — Devney Hamilton, Sassafras Tech Collective