
Red Hat DevNation - Digital Discrimination: Cognitive Bias in Machine Learning

Tools/Communities:
Enter IBM's Call for Code Competition: https://ibm.biz/BdzPJn
AI Fairness 360 Toolkit: http://aif360.mybluemix.net/
Model Asset Exchange: http://ibm.biz/model-exchange
IBM's Data Science and AI Elite Team: http://community.ibm.com/DSE
Watson Studio: https://www.ibm.com/cloud/watson-studio
Watson Machine Learning: https://www.ibm.com/cloud/machine-learning
Watson OpenScale: https://www.ibm.com/cloud/watson-openscale/

Talk Sources:

Cognitive Bias Definition:
https://adweb.clarkson.edu/~awilke/Research_files/EoHB_Wilke_12.pdf

Podcasts/Tweets:
https://leanin.org/podcast-episodes/siri-is-artificial-intelligence-biased
https://art19.com/shows/the-ezra-klein-show/episodes/663fd0b7-ee60-4e3e-b2cb-4fcb4040eef1
https://twitter.com/alexisohanian/status/1087973027055316994
https://twitter.com/MatthewBParksSr/status/1133435312921874432

Northpointe/Equivant's COMPAS - Recidivism Score Algorithm
https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing
https://www.technologyreview.com/s/612775/algorithms-criminal-justice-ai/

Data for Black Lives
http://d4bl.org/about.html
2019 Conference Notes: https://docs.google.com/document/d/1E1mfgTp73QFRmNBunl8cIpyUmDos28rekidux0voTsg/edit?ts=5c39f92e

Gender Shades Project
http://news.mit.edu/2018/study-finds-gender-skin-type-bias-artificial-intelligence-systems-0212
MIT Media Lab Overview for the project: https://www.youtube.com/watch?time_continue=1&v=TWWsW1w-BVo
FAT* 2018 Talk about outcomes: https://www.youtube.com/watch?v=Af2VmR-iGkY
https://www.ajlunited.org/fight

Other articles referenced in this talk:
https://www.nytimes.com/2018/02/12/business/computer-science-ethics-courses.html
https://www.vox.com/science-and-health/2017/4/17/15322378/how-artificial-intelligence-learns-how-to-be-racist
https://www.engadget.com/2019/01/24/pinterest-skin-tone-search-diversity/

Maureen McElaney

June 25, 2019

Transcript

  1. Red Hat DevNation
    Digital Discrimination:
    Cognitive Bias in Machine
    Learning
    June 27, 2019

  2. Digital Discrimination:
    Cognitive Bias in Machine
    Learning
    My name is:
    Maureen McElaney
    Tweet at me! @Mo_Mack

  3. A cognitive bias is a systematic pattern of
    deviation from norm or rationality in
    judgment.
    People make decisions given their limited
    resources.
    Wilke A. and Mata R. (2012) “Cognitive Bias”, Clarkson University

  4. (image slide)

  5. Example of bias in machine
    learning.

  6. Northpointe's COMPAS Algorithm
    Image Credit: #WOCinTech

  7. May 2016 - Northpointe's COMPAS Algorithm
    http://www.equivant.com/solutions/inmate-classification
    Source: https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing

  11. Black Defendants' Risk Scores
    Source: https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing

  12. White Defendants' Risk Scores
    Source: https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing

  13. BLACK VS. WHITE DEFENDANTS
    ○ Falsely labeled black defendants as likely to commit future crimes at twice the rate of white defendants.
    ○ Mislabeled white defendants as low risk more often than black defendants.
    ○ Pegged black defendants as 77% more likely to be at risk of committing a future violent crime.
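The disparities above are comparisons of error rates between groups. A minimal sketch of the underlying arithmetic, using clearly hypothetical confusion counts (not ProPublica's actual COMPAS figures), shows how a "twice the rate" false-positive gap is computed:

```python
from collections import namedtuple

# Hypothetical confusion counts among people who did NOT reoffend --
# illustrative numbers only, not ProPublica's actual COMPAS data.
Counts = namedtuple("Counts", "false_pos true_neg")

def false_positive_rate(c):
    """Share of non-reoffenders wrongly labeled high risk."""
    return c.false_pos / (c.false_pos + c.true_neg)

groups = {
    "group_a": Counts(false_pos=40, true_neg=60),  # FPR = 0.40
    "group_b": Counts(false_pos=20, true_neg=80),  # FPR = 0.20
}

for name, counts in groups.items():
    print(name, false_positive_rate(counts))
```

Both groups can see the same overall accuracy while one group absorbs twice the false-positive rate, which is exactly the kind of disparity ProPublica reported.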

  14. (image slide)

  15. Gender
    Shades Project
    Image Credit: #WOCinTech

  16. Joy Buolamwini,
    Algorithmic Justice League
    Gender Shades Project
    Released February 2018

  17. (image slide)

  18. “If we fail to make
    ethical and inclusive
    artificial intelligence
    we risk losing gains
    made in civil rights
    and gender equity
    under the guise of
    machine neutrality.”
    - Joy Buolamwini
    @jovialjoy

  19. Solutions?
    What can we do
    to combat bias
    in AI?

  20. https://www.vox.com/ezra-klein-show-podcast

  21. “Coders are the
    most empowered
    laborers that have
    ever existed.”
    - Anil Dash
    @anildash

  22. EDUCATION IS
    KEY
    Image Credit: #WOCinTech

  23. https://www.nytimes.com/2018/02/12/business/computer-science-ethics-courses.html

  24. Questions posed to students in these courses...
    Is the technology fair?
    How do you make sure that the data is not biased?
    Should machines be judging humans?

  25. https://twitter.com/Neurosarda/status/1084198368526680064

  26. FIX THE
    PIPELINE?
    Image Credit: #WOCinTech

  27. “Cognitive bias in
    machine learning is
    human bias on
    steroids.”
    - Rediet Abebe
    @red_abebe

  28. https://twitter.com/MatthewBParksSr/status/1133435312921874432

  29. TOOLS TO
    COMBAT BIAS
    Image Credit: #WOCinTech

  30. Tool #1:
    AI Fairness
    360 Toolkit
    Open Source Library

  31. http://aif360.mybluemix.net/

  33. TYPES OF METRICS
    ○ Individual vs. Group Fairness, or Both
    ○ Group Fairness: Data vs Model
    ○ Group Fairness: We’re All Equal vs What
    You See is What You Get
    ○ Group Fairness: Ratios vs Differences
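Two of the group-fairness metrics named above can be sketched in a few lines of plain Python. This is the underlying arithmetic only, not the AI Fairness 360 API (the toolkit wraps these in dataset and metric classes): statistical parity difference is a "differences" style metric, disparate impact a "ratios" style metric.

```python
# Plain-Python sketch of two group-fairness metrics that AI Fairness 360
# implements -- NOT the AIF360 API, just the underlying arithmetic.

def favorable_rate(outcomes):
    """Fraction of favorable (1) outcomes in a group."""
    return sum(outcomes) / len(outcomes)

def statistical_parity_difference(privileged, unprivileged):
    # "Differences" style metric: 0.0 means parity.
    return favorable_rate(unprivileged) - favorable_rate(privileged)

def disparate_impact(privileged, unprivileged):
    # "Ratios" style metric: 1.0 means parity; values below 0.8 are the
    # common "four-fifths rule" red flag.
    return favorable_rate(unprivileged) / favorable_rate(privileged)

# Toy outcome vectors: 1 = favorable model decision.
priv = [1, 1, 1, 0, 1, 1, 0, 1]    # favorable rate 0.75
unpriv = [1, 0, 0, 1, 0, 1, 0, 0]  # favorable rate 0.375

print(statistical_parity_difference(priv, unpriv))  # -0.375
print(disparate_impact(priv, unpriv))               # 0.5
```

Which metric is appropriate depends on the "we're all equal" vs. "what you see is what you get" worldview distinction from the slide above.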

  34. (image slide)

  35. (image slide)

  36. Machine Learning Pipeline
    Pre-Processing: modifying the training data.
    In-Processing: modifying the learning algorithm.
    Post-Processing: modifying the predictions (or outcomes).
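As a concrete example of the pre-processing stage, one simple idea is reweighing: give each (group, label) cell of the training data a weight so that, after weighting, the label looks statistically independent of the protected attribute. This is a simplified illustration of the approach behind AIF360's Reweighing transformer, not the library's API.

```python
# Simplified reweighing sketch (the idea behind AIF360's Reweighing
# pre-processing transformer; not the library's API).
# Weight for cell (group, label) = P(group) * P(label) / P(group, label),
# so weighted counts match what independence would predict.
from collections import Counter

def reweigh(samples):
    """samples: list of (group, label) tuples -> dict of cell weights."""
    n = len(samples)
    group_p = Counter(g for g, _ in samples)
    label_p = Counter(lbl for _, lbl in samples)
    cell_p = Counter(samples)
    return {
        (g, lbl): (group_p[g] / n) * (label_p[lbl] / n) / (cell_p[(g, lbl)] / n)
        for (g, lbl) in cell_p
    }

# Toy data: group "a" gets the favorable label (1) far more often than "b".
data = [("a", 1)] * 6 + [("a", 0)] * 2 + [("b", 1)] * 2 + [("b", 0)] * 6
weights = reweigh(data)
print(weights)  # under-represented cells like ("b", 1) get weight 2.0
```

Training on the weighted samples is one way of "modifying the training data" before the learning algorithm ever sees it.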

  37. (image slide)

  38. http://aif360.mybluemix.net/
    Demos

  39. https://github.com/IBM/AIF360
    AI Fairness 360 Toolkit Public Repo

  40. http://aif360.mybluemix.net/community
    AI Fairness 360 Toolkit Slack

  41. Tool #2:
    Model Asset
    Exchange
    Open Source Pre-Trained
    Deep Learning Models

  42. Step 1: Find a model
    ...that does what you need
    ...that is free to use
    ...that is performant enough

  43. Step 2: Get the code
    Is there a good implementation available?
    ...that does what you need
    ...that is free to use
    ...that is performant enough

  44. Step 3: Verify the
    model
    ○ Does it do what you need?
    ○ Is it free to use (license)?
    ○ Is it performant enough?
    ○ Accuracy?

  45. Step 4: Train the model

  47. Step 5: Deploy your
    model
    ○ Adjust inference code (or write from
    scratch)
    ○ Package inference code, model code, and
    pre-trained weights together
    ○ Deploy your package

  48. Step 6: Consume your
    model

  49. Model Asset
    Exchange
    The Model Asset Exchange (MAX) is a one-stop shop for developers and data scientists to find and use free and open source deep learning models.
    ibm.biz/model-exchange

  50. Model Asset Exchange
    ○ Wide variety of domains (text, audio, images, etc.)
    ○ Multiple deep learning frameworks
    ○ Vetted and tested code/IP
    ○ Build and deploy a model web service in seconds

  51. ibm.biz/model-exchange

  52. http://ibm.biz/model-exchange
    http://ibm.biz/max-slack
    Model Asset eXchange (MAX)

  53. Take Control
    of the Machine
    Learning
    Pipeline

  54. IBM's AI Portfolio
    Everything you need for Enterprise AI, on any cloud
    Build: Watson Studio
    Deploy: Watson Machine Learning
    Manage: Watson OpenScale
    Interact with Pre-built AI Services: Watson Application Services
    Unify on a Multicloud Data Platform: IBM Cloud Private for Data
    AI Open Source Frameworks

  55. Need help?
    IBM Data Science & AI Elite team.
    Find them: community.ibm.com/DSE

  56. UPDATE TO
    THE GENDER
    SHADES
    PROJECT
    Image Credit: #WOCinTech

  57. http://www.aies-conference.com/wp-content/uploads/2019/01/AIES-19_paper_223.pdf

  58. FAT* 2018: Joy Buolamwini - Intersectional Accuracy Disparities in Commercial Gender Classification
    https://www.youtube.com/watch?v=Af2VmR-iGkY

  59. No matter what, it is our responsibility to build systems that are fair.
    Photo by rawpixel on Unsplash

  60. (image slide)

  61. Enter the Call for Code
    https://ibm.biz/BdzPJn
    Slides
    http://bit.ly/redhat-biasinai
    Any questions?
    @Mo_Mack