
Windy City DevFest - Digital Discrimination: Cognitive Bias in Machine Learning

Tools:
AI Fairness 360 Toolkit: http://aif360.mybluemix.net/
Model Asset Exchange: http://ibm.biz/model-exchange
Image Segmenter Web App: https://github.com/IBM/MAX-Image-Segmenter-Web-App
Diversity in Faces Dataset: https://www.research.ibm.com/artificial-intelligence/trusted-ai/diversity-in-faces/#acces

Sources:

Podcasts
https://leanin.org/podcast-episodes/siri-is-artificial-intelligence-biased
https://art19.com/shows/the-ezra-klein-show/episodes/663fd0b7-ee60-4e3e-b2cb-4fcb4040eef1

Amazon
https://www.aclu.org/blog/privacy-technology/surveillance-technologies/amazons-face-recognition-falsely-matched-28
https://www.openmic.org/news/2019/1/16/halt-rekognition

Google
https://motherboard.vice.com/en_us/article/j5jmj8/google-artificial-intelligence-bias

COMPAS
https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing
https://www.technologyreview.com/s/612775/algorithms-criminal-justice-ai/

Data for Black Lives
Conference Notes: https://docs.google.com/document/d/1E1mfgTp73QFRmNBunl8cIpyUmDos28rekidux0voTsg/edit?ts=5c39f92e

Gender Shades Project
http://news.mit.edu/2018/study-finds-gender-skin-type-bias-artificial-intelligence-systems-0212
https://www.youtube.com/watch?time_continue=1&v=TWWsW1w-BVo
https://www.ajlunited.org/fight

https://www.nytimes.com/2018/02/12/business/computer-science-ethics-courses.html
https://www.vox.com/science-and-health/2017/4/17/15322378/how-artificial-intelligence-learns-how-to-be-racist
https://www.engadget.com/2019/01/24/pinterest-skin-tone-search-diversity/

Maureen McElaney

February 01, 2019

Transcript

  1. Digital Discrimination: Cognitive Bias in Machine Learning. Maureen McElaney, Developer Advocate, Center for Open-Source Data and AI Technologies (CODAIT), @Mo_Mack. December 19, 2018 / © 2018 IBM Corporation
  2. “A cognitive bias is a systematic pattern of deviation from norm or rationality in judgment. Individuals create their own "subjective social reality" from their perception of the input.” - Wikipedia
  3.-5. October 2017 - Google Natural Language API (https://cloud.google.com/natural-language/). Source: https://motherboard.vice.com/en_us/article/j5jmj8/google-artificial-intelligence-bias
  6. “We will correct this specific case, and, more broadly, building more inclusive algorithms is crucial to bringing the benefits of machine learning to everyone.” - Google
  7. (image-only slide)

  8. “We have entered the age of automation overconfident, yet underprepared. If we fail to make ethical and inclusive artificial intelligence we risk losing gains made in civil rights and gender equity under the guise of machine neutrality.” - @jovialjoy
  9. QUESTIONS POSED TO STUDENTS
     ◦ Is the technology fair?
     ◦ How do you make sure that the data is not biased?
     ◦ Should machines be judging humans?
  10. Rediet Abebe, Black in AI: “Cognitive bias in machine learning is human bias on steroids.” - @red_abebe
  11. January 2019 - New search feature on https://www.pinterest.com/. Source: https://www.engadget.com/2019/01/24/pinterest-skin-tone-search-diversity/
  12. “By combining the latest in machine learning and inclusive product development, we're able to directly respond to Pinner feedback and build a more useful product.” - Pinterest
  13. TYPES OF METRICS
     ◦ Individual vs. Group Fairness, or Both
     ◦ Group Fairness: Data vs. Model
     ◦ Group Fairness: We're All Equal vs. What You See Is What You Get
     ◦ Group Fairness: Ratios vs. Differences (see the sketch below)
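To make the last distinction concrete, here is a minimal sketch using the AI Fairness 360 toolkit linked above. The toy DataFrame, the column names ("sex", "hired"), and the privileged/unprivileged group definitions are all hypothetical, chosen only to show one difference-based and one ratio-based group-fairness metric.

```python
import pandas as pd
from aif360.datasets import BinaryLabelDataset
from aif360.metrics import BinaryLabelDatasetMetric

# Hypothetical toy data: 'sex' is the protected attribute
# (1 = privileged group), 'hired' is the binary label (1 = favorable).
df = pd.DataFrame({
    "sex":   [1, 1, 1, 1, 0, 0, 0, 0],
    "hired": [1, 1, 1, 0, 1, 0, 0, 0],
})

dataset = BinaryLabelDataset(
    df=df,
    label_names=["hired"],
    protected_attribute_names=["sex"],
    favorable_label=1,
    unfavorable_label=0,
)

privileged = [{"sex": 1}]
unprivileged = [{"sex": 0}]

metric = BinaryLabelDatasetMetric(
    dataset, unprivileged_groups=unprivileged, privileged_groups=privileged
)

# Difference: P(favorable | unprivileged) - P(favorable | privileged)
print("Statistical parity difference:", metric.statistical_parity_difference())
# Ratio: P(favorable | unprivileged) / P(favorable | privileged)
print("Disparate impact:", metric.disparate_impact())
```

On this toy data the hiring rates are 0.75 for the privileged group and 0.25 for the unprivileged one, so the sketch should print a statistical parity difference of -0.5 and a disparate impact of about 0.33; values of 0 (difference) and 1 (ratio) would indicate parity.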
  14. Machine Learning Pipeline
     ◦ Pre-Processing: modifying the training data.
     ◦ In-Processing: modifying the learning algorithm.
     ◦ Post-Processing: modifying the predictions (or outcomes).
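As a hedged illustration of the pre-processing stage, the sketch below applies Reweighing, one of the pre-processing algorithms shipped in the AI Fairness 360 toolkit linked above. It assigns instance weights that make the favorable outcome statistically independent of the protected attribute before any model is trained. The toy data and group definitions are the same hypothetical ones as in the previous sketch.

```python
import pandas as pd
from aif360.datasets import BinaryLabelDataset
from aif360.algorithms.preprocessing import Reweighing
from aif360.metrics import BinaryLabelDatasetMetric

# Same hypothetical toy data as above: 'sex' is the protected attribute,
# 'hired' the binary label (1 = favorable outcome).
df = pd.DataFrame({
    "sex":   [1, 1, 1, 1, 0, 0, 0, 0],
    "hired": [1, 1, 1, 0, 1, 0, 0, 0],
})
dataset = BinaryLabelDataset(
    df=df, label_names=["hired"], protected_attribute_names=["sex"],
    favorable_label=1, unfavorable_label=0,
)
privileged = [{"sex": 1}]
unprivileged = [{"sex": 0}]

# Pre-processing: learn instance weights that balance outcomes across
# groups, then attach them to a transformed copy of the training data.
rw = Reweighing(unprivileged_groups=unprivileged,
                privileged_groups=privileged)
dataset_transf = rw.fit_transform(dataset)

# The weighted base rates are now equal, so the difference metric is ~0.
metric = BinaryLabelDatasetMetric(
    dataset_transf, unprivileged_groups=unprivileged,
    privileged_groups=privileged
)
print("Statistical parity difference after reweighing:",
      metric.statistical_parity_difference())
```

The same toolkit also ships in-processing algorithms (e.g. adversarial debiasing), which constrain the learner itself, and post-processing algorithms (e.g. reject option classification), which adjust a trained model's predictions, matching the other two stages on the slide.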
  15. Diversity in Faces Dataset: Studying diversity in faces is complex. The dataset provides a jumping-off point for the global research community to further our collective knowledge.
  16. Links to Sources (the same list given under "Sources" above)