
IBM Twitch - Digital Discrimination: Cognitive Bias in Machine Learning

Tools:
AI Fairness 360 Toolkit: http://aif360.mybluemix.net/
Model Asset Exchange: http://ibm.biz/model-exchange
Image Segmenter Web App: https://github.com/IBM/MAX-Image-Segmenter-Web-App
Diversity in Faces Dataset: https://www.research.ibm.com/artificial-intelligence/trusted-ai/diversity-in-faces/#access

Sources:

Podcasts
https://leanin.org/podcast-episodes/siri-is-artificial-intelligence-biased
https://art19.com/shows/the-ezra-klein-show/episodes/663fd0b7-ee60-4e3e-b2cb-4fcb4040eef1

Amazon
https://www.aclu.org/blog/privacy-technology/surveillance-technologies/amazons-face-recognition-falsely-matched-28
https://www.openmic.org/news/2019/1/16/halt-rekognition

Google
https://motherboard.vice.com/en_us/article/j5jmj8/google-artificial-intelligence-bias

COMPAS
https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing
https://www.technologyreview.com/s/612775/algorithms-criminal-justice-ai/

Data for Black Lives
Conference Notes: https://docs.google.com/document/d/1E1mfgTp73QFRmNBunl8cIpyUmDos28rekidux0voTsg/edit?ts=5c39f92e

Gender Shades Project
http://news.mit.edu/2018/study-finds-gender-skin-type-bias-artificial-intelligence-systems-0212
https://www.youtube.com/watch?time_continue=1&v=TWWsW1w-BVo
https://www.ajlunited.org/fight

Other resources referenced in this talk:
https://www.nytimes.com/2018/02/12/business/computer-science-ethics-courses.html
https://www.vox.com/science-and-health/2017/4/17/15322378/how-artificial-intelligence-learns-how-to-be-racist
https://www.engadget.com/2019/01/24/pinterest-skin-tone-search-diversity/

Maureen McElaney

March 15, 2019


Transcript

  1. Digital Discrimination:
    Cognitive Bias in
    Machine Learning
    Maureen McElaney
    Developer Advocate, Center for
    Open-Source Data and AI Technologies
    (CODAIT)
    @Mo_Mack
    December 19, 2018 / © 2018 IBM Corporation


  2. HELLO!
    I am Maureen McElaney
    You can find me at:
    @Mo_Mack
    2


  3. Digital Discrimination
    Cognitive Bias in Machine Learning
    3



  4. “A cognitive bias is a systematic pattern of
    deviation from norm or rationality in
    judgment. Individuals create their own
    "subjective social reality" from their
    perception of the input.”
    - Wikipedia
    4


  5. 5
    https://twitter.com/alexisohanian/status/1087973027055316994


  6. Examples of Cognitive
    Bias in Machine
    Learning


  7. Google’s Cloud
    Natural Language
    API
    7 Image Credit: #WOCinTech


  8. October 2017 - Google Natural Language API
    https://cloud.google.com/natural-language/
    8
    Source: https://motherboard.vice.com/en_us/article/j5jmj8/google-artificial-intelligence-bias


  9. October 2017 - Google Natural Language API
    https://cloud.google.com/natural-language/
    9
    Source: https://motherboard.vice.com/en_us/article/j5jmj8/google-artificial-intelligence-bias


  10. October 2017 - Google Natural Language API
    https://cloud.google.com/natural-language/
    10
    Source: https://motherboard.vice.com/en_us/article/j5jmj8/google-artificial-intelligence-bias



  11. "We will correct this specific case,
    and, more broadly, building more
    inclusive algorithms is crucial to
    bringing the benefits of machine
    learning to everyone."
    11


  12. Northpointe’s
    COMPAS
    Algorithm
    12 Image Credit: #WOCinTech


  13. May 2016 - Northpointe’s COMPAS Algorithm
    http://www.equivant.com/solutions/inmate-classification
    13
    Source:
    https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing


  14. May 2016 - Northpointe’s COMPAS Algorithm
    http://www.equivant.com/solutions/inmate-classification
    14
    Source:
    https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing


  15. May 2016 - Northpointe’s COMPAS Algorithm
    http://www.equivant.com/solutions/inmate-classification
    15
    Source:
    https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing


  16. May 2016 - Northpointe’s COMPAS Algorithm
    http://www.equivant.com/solutions/inmate-classification
    16
    Source:
    https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing


  17. 17
    Source:
    https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing


  18. 18
    Source:
    https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing


  19. 19


  20. Amazon
    Rekognition
    20 Image Credit: #WOCinTech


  21. July 2018 - Amazon Rekognition
    https://aws.amazon.com/rekognition/
    21
    Source: https://www.aclu.org/blog/privacy-technology/surveillance-technologies/amazons-face-recognition-falsely-matched-28


  22. January 2019 - Amazon Rekognition
    https://aws.amazon.com/rekognition/
    22
    Source: https://www.openmic.org/news/2019/1/16/halt-rekognition


  23. Joy Buolamwini,
    Algorithmic Justice League
    Gender Shades Project
    Released February 2018
    23


  24.


  25. “We have entered the age of automation
    overconfident, yet underprepared. If we fail
    to make ethical and inclusive artificial
    intelligence we risk losing gains made in civil
    rights and gender equity under the guise of
    machine neutrality.” - @jovialjoy
    25


  26. SOLUTIONS?
    26
    What can we do to combat bias in AI?


  27. 27
    https://art19.com/shows/the-ezra-klein-show/episodes/663fd0b7-ee60-4e3e-b2cb-4fcb4040eef1



  28. "Coders are the most empowered
    laborers that have ever existed."
    - @anildash
    28


  29. EDUCATION IS
    KEY
    29 Image Credit: #WOCinTech


  30. 30
    https://www.nytimes.com/2018/02/12/business/computer-science-ethics-courses.html


  31. QUESTIONS POSED TO
    STUDENTS
    ○ Is the technology fair?
    ○ How do you make sure that the data is not
    biased?
    ○ Should machines be judging humans?
    31


  32. 32
    https://twitter.com/Neurosarda/status/1084198368526680064


  33. FIX THE
    PIPELINE?
    33 Image Credit: #WOCinTech


  34. Rediet Abebe, Black in AI
    “Cognitive bias in machine learning is human bias on
    steroids.” - @red_abebe
    34


  35. January 2019 - New Search Feature on
    https://www.pinterest.com/
    35
    Source: https://www.engadget.com/2019/01/24/pinterest-skin-tone-search-diversity/



  36. "By combining the latest in machine
    learning and inclusive product
    development, we're able to directly
    respond to Pinner feedback and
    build a more useful product."
    - Pinterest
    36


  37. TOOLS TO
    COMBAT BIAS
    37 Image Credit: #WOCinTech


  38. TOOL #1:
    AI Fairness 360 Toolkit
    Open Source Library


  39. http://aif360.mybluemix.net/


  40.

  41. TYPES OF METRICS
    ○ Individual vs. Group Fairness, or Both
    ○ Group Fairness: Data vs Model
    ○ Group Fairness: We’re All Equal vs What
    You See is What You Get
    ○ Group Fairness: Ratios vs Differences
    41
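    The ratio-vs-difference distinction above can be sketched in plain Python. This is an illustrative computation of two common group-fairness measures, not the AI Fairness 360 API itself; the function names and the toy data are assumptions for the example. Disparate impact expresses the gap between groups as a ratio of favorable-outcome rates, while statistical parity difference expresses the same comparison as a difference.

    ```python
    from typing import Sequence

    def favorable_rate(labels: Sequence[int]) -> float:
        """Fraction of favorable (1) outcomes in a group."""
        return sum(labels) / len(labels)

    def disparate_impact(unpriv: Sequence[int], priv: Sequence[int]) -> float:
        """Ratio of favorable rates: unprivileged group over privileged group."""
        return favorable_rate(unpriv) / favorable_rate(priv)

    def statistical_parity_difference(unpriv: Sequence[int], priv: Sequence[int]) -> float:
        """Difference of favorable rates: unprivileged minus privileged."""
        return favorable_rate(unpriv) - favorable_rate(priv)

    # Toy data: 40% favorable outcomes for the unprivileged group vs 60% for the privileged.
    unpriv = [1, 1, 0, 0, 0, 1, 0, 0, 1, 0]
    priv = [1, 1, 1, 0, 0, 1, 1, 0, 1, 0]

    print(disparate_impact(unpriv, priv))               # 0.4 / 0.6, about 0.67
    print(statistical_parity_difference(unpriv, priv))  # 0.4 - 0.6, about -0.2
    ```

    A value of 1.0 for the ratio (or 0.0 for the difference) would indicate parity between the groups; AIF360 exposes metrics in both forms so practitioners can pick the framing that suits their fairness criterion.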


  42.

  43.

  44. Machine Learning Pipeline
    Pre-Processing: modifying the training data.
    In-Processing: modifying the learning algorithm.
    Post-Processing: modifying the predictions (or outcomes).
    44


  45.

  46. DEMO
    http://aif360.mybluemix.net/


    https://github.com/IBM/AIF360
    AI Fairness 360 Toolkit Public Repo
    47


    http://aif360.mybluemix.net/community
    AI Fairness 360 Slack Community
    48


  49. TOOL #2:
    Model Asset Exchange
    Open Source Pre-Trained Deep Learning
    Models


  50. http://ibm.biz/model-exchange


  51. http://ibm.biz/model-exchange


  52. https://github.com/IBM/MAX-Image-Segmenter-Web-App


  53. https://github.com/IBM/MAX-Image-Segmenter-Web-App


    http://ibm.biz/model-exchange
    Model Asset eXchange
    54


  55. TOOL #3:
    Diversity in Faces
    Dataset
    Diverse dataset available to the global
    research community.


  56. Diversity in Faces Dataset
    Studying diversity in faces is complex. The dataset
    provides a jumping-off point for the global research
    community to further our collective knowledge.
    56


    https://www.research.ibm.com/artificial-intelligence/trusted-ai/diversity-in-faces/#access
    Diversity in Faces Dataset
    57


  58. UPDATE TO THE
    GENDER SHADES
    PROJECT
    58 Image Credit: #WOCinTech


  59. 59
    http://www.aies-conference.com/wp-content/uploads/2019/01/AIES-19_paper_223.pdf


  60. 60
    https://www.ajlunited.org/fight


  61. 61
    Photo by rawpixel on Unsplash


  62. Links to Sources:
    Podcasts
    ○ https://leanin.org/podcast-episodes/siri-is-artificial-intelligence-biased
    ○ https://art19.com/shows/the-ezra-klein-show/episodes/663fd0b7-ee60-4e3e-b2cb-4fcb4040eef1
    Amazon
    ○ https://www.aclu.org/blog/privacy-technology/surveillance-technologies/amazons-face-recognition-falsely-matched-28
    ○ https://www.openmic.org/news/2019/1/16/halt-rekognition
    Google
    ○ https://motherboard.vice.com/en_us/article/j5jmj8/google-artificial-intelligence-bias
    COMPAS
    ○ https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing
    ○ https://www.technologyreview.com/s/612775/algorithms-criminal-justice-ai/
    Data for Black Lives
    ○ Conference Notes:
    https://docs.google.com/document/d/1E1mfgTp73QFRmNBunl8cIpyUmDos28rekidux0voTsg/edit?ts=5c39f92e
    Gender Shades Project
    ○ http://news.mit.edu/2018/study-finds-gender-skin-type-bias-artificial-intelligence-systems-0212
    ○ https://www.youtube.com/watch?time_continue=1&v=TWWsW1w-BVo
    ○ https://www.ajlunited.org/fight
    ○ https://www.nytimes.com/2018/02/12/business/computer-science-ethics-courses.html
    ○ https://www.vox.com/science-and-health/2017/4/17/15322378/how-artificial-intelligence-learns-how-to-be-racist
    ○ https://www.engadget.com/2019/01/24/pinterest-skin-tone-search-diversity/
    62


  63. THANKS!
    Slides and Links:
    http://bit.ly/ibm-twitch-bias
    Any questions?
    You can find me at:
    @Mo_Mack
    63
