
Women in Data Science Conference - Digital Discrimination: Cognitive Bias in Machine Learning

Maureen McElaney

April 13, 2019

Transcript

  1. Digital Discrimination
    Cognitive Bias in Machine Learning


  2. “A cognitive bias is a systematic pattern
    of deviation from norm or rationality in
    judgment. Individuals create their own
    "subjective social reality" from their
    perception of the input.”
    - Wikipedia

  3. https://twitter.com/alexisohanian/status/1087973027055316994

  4. Example of Cognitive
    Bias in Machine
    Learning


  5. Northpointe’s
    COMPAS
    Algorithm
    Image Credit: #WOCinTech

  6. May 2016 - Northpointe’s COMPAS Algorithm
    http://www.equivant.com/solutions/inmate-classification
    Source:
    https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing


  10. Source:
    https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing


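The ProPublica finding referenced above was that the algorithm's error rates differed by group. A minimal sketch of that kind of comparison, using invented labels and predictions rather than the real COMPAS data:

```python
# Compare false positive rates across two groups -- the disparity
# ProPublica reported for COMPAS. All numbers below are made up
# for illustration; they are not the actual COMPAS figures.
def false_positive_rate(y_true, y_pred):
    """Share of actual negatives (y_true == 0) that were predicted positive."""
    false_pos = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    negatives = sum(1 for t in y_true if t == 0)
    return false_pos / negatives

# Hypothetical outcomes (0 = did not reoffend) and risk labels (1 = high risk)
group_a = ([0, 0, 0, 0, 1, 1], [1, 1, 0, 0, 1, 1])  # 2 of 4 negatives flagged
group_b = ([0, 0, 0, 0, 1, 1], [1, 0, 0, 0, 1, 1])  # 1 of 4 negatives flagged

print(false_positive_rate(*group_a))  # 0.5
print(false_positive_rate(*group_b))  # 0.25
```

Equal overall accuracy can hide exactly this kind of gap, which is why per-group error rates matter.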


  13. “We have entered the age of automation
    overconfident, yet underprepared. If we fail
    to make ethical and inclusive artificial
    intelligence we risk losing gains made in civil
    rights and gender equity under the guise of
    machine neutrality.” - @jovialjoy

  14. SOLUTIONS?
    What can we do to combat bias in AI?

  15. https://art19.com/shows/the-ezra-klein-show/episodes/663fd0b7-ee60-4e3e-b2cb-4fcb4040eef1


  16. “Coders are the most empowered
    laborers that have ever existed.”
    - @anildash

  17. EDUCATION IS
    KEY
    Image Credit: #WOCinTech

  18. https://www.nytimes.com/2018/02/12/business/computer-science-ethics-courses.html

  19. QUESTIONS POSED TO
    STUDENTS
    ○ Is the technology fair?
    ○ How do you make sure that the data is not
    biased?
    ○ Should machines be judging humans?

  20. https://twitter.com/Neurosarda/status/1084198368526680064

  21. FIX THE
    PIPELINE?
    Image Credit: #WOCinTech

  22. Rediet Abebe, Black in AI
    “Cognitive bias in machine learning is human bias on
    steroids.” - @red_abebe

  23. January 2019 - New Search Feature on
    https://www.pinterest.com/
    Source: https://www.engadget.com/2019/01/24/pinterest-skin-tone-search-diversity/


  24. “By combining the latest in machine
    learning and inclusive product
    development, we're able to directly
    respond to Pinner feedback and
    build a more useful product.”
    - Pinterest

  25. TOOLS TO
    COMBAT BIAS
    Image Credit: #WOCinTech

  26. TOOL #1:
    AI Fairness 360 Toolkit
    Open Source Library


  27. http://aif360.mybluemix.net/


  29. DEMO
    http://aif360.mybluemix.net/


  30. https://github.com/IBM/AIF360
    AI Fairness 360 Toolkit Public Repo

  31. http://aif360.mybluemix.net/community
    AI Fairness 360 Slack Community
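AI Fairness 360 ships a large catalog of fairness metrics. One of the simplest, disparate impact, can be sketched in plain Python; this is a standalone illustration with toy data, not the toolkit's own API:

```python
# Disparate impact: ratio of favorable-outcome rates for an
# unprivileged group vs. a privileged group; values well below 1.0
# flag potential bias. AI Fairness 360 computes this (and many other
# metrics) natively -- this sketch just shows the underlying idea.
def disparate_impact(outcomes, groups, unprivileged, privileged):
    """outcomes: 1 = favorable; groups: group label per individual."""
    def favorable_rate(group):
        member_outcomes = [o for o, g in zip(outcomes, groups) if g == group]
        return sum(member_outcomes) / len(member_outcomes)
    return favorable_rate(unprivileged) / favorable_rate(privileged)

# Invented data: group "a" receives the favorable outcome half as often.
outcomes = [1, 0, 0, 1, 1, 1, 1, 1]
groups   = ["a", "a", "a", "a", "b", "b", "b", "b"]
print(disparate_impact(outcomes, groups, "a", "b"))  # 0.5 / 1.0 = 0.5
```

The toolkit also includes mitigation algorithms (pre-, in-, and post-processing) beyond measurement alone.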

  32. TOOL #2:
    Model Asset Exchange
    Open Source Pre-Trained Deep Learning
    Models


  33. http://ibm.biz/model-exchange



  35. http://ibm.biz/model-exchange
    Model Asset eXchange
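Model Asset eXchange models are typically packaged as containers that serve predictions over a small REST API. A hypothetical sketch of parsing such a JSON response; the endpoint path and response shape here are assumptions, so check a specific model's Swagger docs before relying on them:

```python
import json

# Hypothetical helpers for a locally served MAX model. The
# /model/predict path and the {"predictions": [...]} shape are
# assumptions for illustration, not a guaranteed contract.
def predict_url(host="localhost", port=5000):
    return f"http://{host}:{port}/model/predict"

def parse_predictions(response_text):
    """Extract (label, probability) pairs from a MAX-style JSON reply."""
    body = json.loads(response_text)
    return [(p["label"], p["probability"]) for p in body.get("predictions", [])]

# Example with a canned response, so no running server is needed:
sample = '{"status": "ok", "predictions": [{"label": "cat", "probability": 0.91}]}'
print(parse_predictions(sample))  # [('cat', 0.91)]
```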

  36. Photo by rawpixel on Unsplash

  37. THANKS!
    Slides and Links:
    http://bit.ly/wids-bias
    Any questions?
    You can find me on Twitter at:
    @Mo_Mack