Slide 1

AN AI WITH AN AGENDA
How Our Biases Leak Into Machine Learning

Arthur Doler @arthurdoler arthurdoler@gmail.com
Slides:
Handout: bit.ly/art-ai-with-agenda

Slide 2

LET’S ALL PLAY A GAME

Slide 3

“THE NURSE SAID”

Slide 4

“THE SOFTWARE ENGINEER SAID”

Slide 5

No content

Slide 6

No content

Slide 7

No content

Slide 8

No content

Slide 9

No content

Slide 10

No content

Slide 11

No content

Slide 12

No content

Slide 13

REAL CONSEQUENCES

Slide 14

https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing

Slide 15

https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing

Slide 16

http://blog.conceptnet.io/posts/2017/how-to-make-a-racist-ai-without-really-trying/
Aylin Caliskan-Islam, Joanna J. Bryson, and Arvind Narayanan, 2016

Slide 17

http://blog.conceptnet.io/posts/2017/how-to-make-a-racist-ai-without-really-trying/
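
The blog post's punch line is that an ordinary sentiment classifier built on pretrained word embeddings scores text differently depending on the names in it. A minimal sketch of the underlying association test, in the spirit of the Caliskan-Islam, Bryson, and Narayanan paper cited above; the four-dimensional vectors and word list are invented for illustration, where the real experiment uses pretrained GloVe or word2vec embeddings:

```python
# Sketch of a word-embedding association test (WEAT). The 4-d vectors are
# invented stand-ins for real pretrained embeddings such as GloVe.
import numpy as np

embeddings = {
    "emily":     np.array([0.9, 0.1, 0.3, 0.0]),  # hypothetical vectors
    "shaniqua":  np.array([0.1, 0.9, 0.2, 0.1]),
    "wonderful": np.array([0.8, 0.2, 0.4, 0.0]),
    "terrible":  np.array([0.0, 0.8, 0.1, 0.2]),
}

def cosine(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

def association(word, pleasant, unpleasant):
    # Mean similarity to pleasant words minus mean similarity to unpleasant
    # ones; a gap between the two names shows bias baked into the vectors.
    v = embeddings[word]
    return (np.mean([cosine(v, embeddings[p]) for p in pleasant])
            - np.mean([cosine(v, embeddings[u]) for u in unpleasant]))

for name in ("emily", "shaniqua"):
    print(name, association(name, ["wonderful"], ["terrible"]))
```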

Slide 18

No content

Slide 19

SIX CLASSES OF PROBLEM WITH AI/ML

Slide 20

No content

Slide 21

No content

Slide 22

No content

Slide 23

Class I – Phantoms of False Correlation
Class II – Specter of Biased Sample Data
Class III – Shade of Overly-Simplistic Maximization
Class V – The Simulation Surprise
Class VI – Apparition of Fairness
Class VII – The Feedback Devil

Slide 24

No content

Slide 25

No content

Slide 26

No content

Slide 27

No content

Slide 28

No content

Slide 29

No content

Slide 30

http://www.tylervigen.com/spurious-correlations - Data sources: Centers for Disease Control & Prevention and Internet Movie Database

Slide 31

http://www.tylervigen.com/spurious-correlations - Data sources: National Vital Statistics Reports and U.S. Department of Agriculture

Slide 32

http://www.tylervigen.com/spurious-correlations - Data sources: National Spelling Bee and Centers for Disease Control & Prevention
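
These charts fall out of a simple mechanism: scan enough unrelated time series and some pair will correlate strongly by pure chance. A sketch of that mechanism, assuming nothing but independent random walks (all sizes arbitrary):

```python
# Generate many unrelated random walks and report the strongest pairwise
# correlation found; with enough series, a very high |r| appears by chance.
import numpy as np

rng = np.random.default_rng(0)
walks = rng.normal(size=(200, 30)).cumsum(axis=1)  # 200 independent walks

corr = np.corrcoef(walks)          # 200 x 200 matrix of pairwise correlations
np.fill_diagonal(corr, 0)          # ignore each series' self-correlation
i, j = np.unravel_index(np.abs(corr).argmax(), corr.shape)
print(f"series {i} vs series {j}: r = {corr[i, j]:.3f}")  # typically |r| > 0.9
```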

Slide 33

No content

Slide 34

KNOW WHAT QUESTION YOU’RE ASKING UP FRONT

Slide 35

USE CONDITIONAL PROBABILITY OVER CORRELATION

Slide 36

https://versionone.vc/correlation-probability/
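
To make the distinction concrete, a sketch with hypothetical binary data: the correlation coefficient comes out as a small unitless number that is hard to act on, while the two conditional probabilities answer the question directly. The 5% base rate and 2% lift are assumptions of the toy, not figures from the talk:

```python
# Correlation vs conditional probability on toy binary data.
import numpy as np

rng = np.random.default_rng(1)
clicked = rng.integers(0, 2, size=10_000)
# Assumed toy model: 5% base purchase rate, +2 points if the ad was clicked.
purchased = (rng.random(10_000) < 0.05 + 0.02 * clicked).astype(int)

r = np.corrcoef(clicked, purchased)[0, 1]
print(f"correlation r = {r:.3f}")                 # small and hard to act on
print(f"P(purchase | clicked)   = {purchased[clicked == 1].mean():.3f}")
print(f"P(purchase | no click)  = {purchased[clicked == 0].mean():.3f}")
```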

Slide 37

No content

Slide 38

No content

Slide 39

MORTGAGE LENDING ANALYSIS

Slide 40

No content

Slide 41

No content

Slide 42

No content

Slide 43

No content

Slide 44

https://www.reuters.com/article/us-amazon-com-jobs-automation-insight/amazon-scraps-secret-ai-recruiting-tool-that-showed-bias-against-women-idUSKCN1MK08G

Slide 45

Twitter - @quantoidasaurus (Used with permission)

Slide 46

YOUR SAMPLE MIGHT NOT BE REPRESENTATIVE

Slide 47

YOUR DATA MIGHT NOT BE REPRESENTATIVE

Slide 48

No content

Slide 49

MODELS REPRESENT WHAT WAS
THEY DON’T TELL YOU WHAT SHOULD BE

Slide 50

FIND A BETTER DATA SET! CONCEPTNET.IO

Slide 51

BUILD A BETTER DATA SET!

Slide 52

No content

Slide 53

BEWARE SHADOW COLUMNS
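
A shadow column is an innocent-looking feature that encodes a protected one, so dropping the protected column alone changes little. A sketch on synthetic data, assuming a hypothetical zip_code proxy that agrees with the protected attribute 90% of the time; the model never sees the protected column, yet its decisions still split along it:

```python
# The protected column is withheld from training, but a correlated proxy
# ("shadow column") lets the model reconstruct it anyway.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
n = 5_000
protected = rng.integers(0, 2, size=n)          # hypothetical protected class
zip_code = protected ^ (rng.random(n) < 0.1)    # proxy, 90% aligned with it
income = rng.normal(50 + 10 * protected, 5, n)  # historically biased outcomes
approved = (income + rng.normal(0, 5, n) > 55).astype(int)

# Train on the proxy plus pure noise -- never on the protected column itself.
X = np.column_stack([zip_code, rng.normal(size=n)])
pred = LogisticRegression().fit(X, approved).predict(X)

print("approval rate, group 0:", pred[protected == 0].mean())
print("approval rate, group 1:", pred[protected == 1].mean())
```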

Slide 54

MAKE SURE YOUR SAMPLE SET IS REPRESENTATIVE
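
One concrete guard here is a stratified split, so a small group keeps its real-world share in both the training and evaluation halves. A sketch using scikit-learn's train_test_split; the 95/5 group mix is invented:

```python
# A plain random split can under-sample a rare group in the evaluation set;
# stratifying keeps each group's proportion the same in both halves.
import numpy as np
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)
X = rng.normal(size=(1_000, 4))
group = rng.choice(["majority", "minority"], size=1_000, p=[0.95, 0.05])

X_train, X_test, g_train, g_test = train_test_split(
    X, group, test_size=0.2, stratify=group, random_state=0)

print("minority share overall:", (group == "minority").mean())
print("minority share in test:", (g_test == "minority").mean())
```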

Slide 55

No content

Slide 56

No content

Slide 57

IBM’S AI FAIRNESS TOOLKIT

Slide 58

AI FAIRNESS TOOLKIT
https://aif360.mybluemix.net

Slide 59

https://aif360.mybluemix.net
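
For flavor, a sketch of the measure-then-mitigate flow as I understand the aif360 package's API (the eight-row loan table and the choice of sex as the protected attribute are hypothetical): wrap a dataframe, compute a group-fairness metric, then reweigh the training data:

```python
# Sketch of the aif360 flow: wrap a dataframe, measure, mitigate.
import pandas as pd
from aif360.datasets import BinaryLabelDataset
from aif360.metrics import BinaryLabelDatasetMetric
from aif360.algorithms.preprocessing import Reweighing

df = pd.DataFrame({                 # hypothetical eight-row loan table
    "sex":      [0, 0, 1, 1, 0, 1, 1, 0],
    "income":   [30, 45, 50, 60, 35, 55, 40, 25],
    "approved": [0, 0, 1, 1, 0, 1, 1, 0],
})
data = BinaryLabelDataset(df=df, label_names=["approved"],
                          protected_attribute_names=["sex"])

priv, unpriv = [{"sex": 1}], [{"sex": 0}]
metric = BinaryLabelDatasetMetric(data, privileged_groups=priv,
                                  unprivileged_groups=unpriv)
print("disparate impact:", metric.disparate_impact())   # 1.0 would be parity

# Reweighing reweights training instances so the label looks independent of
# the protected attribute before any model is fit.
fair_data = Reweighing(unprivileged_groups=unpriv,
                       privileged_groups=priv).fit_transform(data)
```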

Slide 60

No content

Slide 61

No content

Slide 62

No content

Slide 63

https://aif360.mybluemix.net

Slide 64

https://aif360.mybluemix.net

Slide 65

https://pair-code.github.io/what-if-tool

Slide 66

HAVE A GOOD PROCESS

Slide 67

KEEP IN MIND YOU NEED TO KNOW WHO CAN BE AFFECTED IN ORDER TO UN-BIAS

Slide 68

No content

Slide 69

PRICING ALGORITHMS

Slide 70

Calvano, Calzolari, Denicolò and Pastorello (2018)

Slide 71

No content

Slide 72

No content

Slide 73

Calvano, Calzolari, Denicolò and Pastorello (2018)

Slide 74

WHAT IF AMAZON BUILT A SALARY TOOL INSTEAD?

Slide 75

THE BRATWURST PROBLEM

Slide 76

HUMANS ARE RARELY SINGLE-MINDED

Slide 77

No content

Slide 78

No content

Slide 79

https://www.alexirpan.com/2018/02/14/rl-hard.html; Gu, Lillicrap, Sutskever, & Levine, 2016

Slide 80

No content

Slide 81

MODELS REPRESENT WHAT WAS
THEY DON’T TELL YOU WHAT SHOULD BE

Slide 82

DON’T TRUST ALGORITHMS TO MAKE SUBTLE OR LARGE MULTI-VARIABLE JUDGEMENTS

Slide 83

No content

Slide 84

MORE COMPLEX ALGORITHMS THAT INCLUDE OUTSIDE INFLUENCE

Slide 85

No content

Slide 86

Lehman, Clune, & Misevic, 2018

Slide 87

Cheney, MacCurdy, Clune, & Lipson, 2013

Slide 88

No content

Slide 89

BE READY

Slide 90

DON’T CONFUSE THE MAP WITH THE TERRITORY

Slide 91

VERIFY AND CHECK SOLUTIONS DERIVED FROM SIMULATION

Slide 92

No content

Slide 93

No content

Slide 94

BUT WHAT HAPPENS WITH DIALECTAL LANGUAGE?
Blodgett, Green, and O’Connor, 2016

Slide 95

MANY AI/ML TOOLS ARE TRAINED TO MINIMIZE AVERAGE LOSS

Slide 96

REPRESENTATION DISPARITY
Hashimoto, Srivastava, Namkoong, and Liang, 2018

Slide 97

No content

Slide 98

CONSIDER PREDICTIVE ACCURACY AS A RESOURCE TO BE ALLOCATED
Hashimoto, Srivastava, Namkoong, and Liang, 2018

Slide 99

DISTRIBUTIONALLY ROBUST OPTIMIZATION
Hashimoto, Srivastava, Namkoong, and Liang, 2018
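
The complaint, in code form: empirical risk minimization averages over everyone, so a small group's bad losses disappear into the mean. A numpy sketch of the two criteria, using explicit group labels for clarity; note that Hashimoto et al.'s actual method is distributionally robust optimization precisely because group labels are often unavailable, and every number below is made up:

```python
# Average loss vs worst-group loss on made-up per-example losses.
import numpy as np

losses = np.array([0.10, 0.20, 0.10, 0.15, 0.90, 1.10])
groups = np.array(["maj", "maj", "maj", "maj", "min", "min"])

avg_loss = losses.mean()                      # what plain ERM minimizes
group_losses = {g: float(losses[groups == g].mean()) for g in ("maj", "min")}
worst_group = max(group_losses.values())      # what group-DRO would minimize

print(f"average loss:     {avg_loss:.2f}")    # 0.43 -- looks healthy
print(f"per-group losses: {group_losses}")    # minority group sits at 1.00
print(f"worst-group loss: {worst_group:.2f}")
```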

Slide 100

No content

Slide 101

LET’S BUILD A PRODUCT WITH OUR TWITTER NLP

Slide 102

WHAT HAPPENS TO PEOPLE WHO USE DIALECT?

Slide 103

PREDICTIVE POLICING

Slide 104

Image via Reddit user u/jakeroot

Slide 105

Ensign, Friedler, Neville, Scheidegger, & Venkatasubramanian, 2017

Slide 106

Ensign, Friedler, Neville, Scheidegger, & Venkatasubramanian, 2017

Slide 107

Ensign, Friedler, Neville, Scheidegger, & Venkatasubramanian, 2017

Slide 108

Ensign, Friedler, Neville, Scheidegger, & Venkatasubramanian, 2017

Slide 109

Ensign, Friedler, Neville, Scheidegger, & Venkatasubramanian, 2017

Slide 110

Ensign, Friedler, Neville, Scheidegger, & Venkatasubramanian, 2017
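
Ensign et al. model this as an urn process. A sketch of the runaway dynamic: the true incident rates below are assumed identical, patrols are allocated in proportion to previously discovered incidents, and incidents can only be discovered where patrols go, so early luck compounds:

```python
# Urn-style feedback loop: patrols follow discovered incidents, and you can
# only discover incidents where you patrol. True rates are equal by design.
import numpy as np

rng = np.random.default_rng(4)
true_rate = {"A": 0.1, "B": 0.1}     # identical underlying incident rates
discovered = {"A": 1, "B": 1}        # seed history: one incident each

for day in range(1_000):
    total = discovered["A"] + discovered["B"]
    district = "A" if rng.random() < discovered["A"] / total else "B"
    if rng.random() < true_rate[district]:
        discovered[district] += 1

# The discovered-incident log drifts far from 50/50, driven by early luck
# rather than by any real difference between the districts.
print(discovered)
```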

Slide 111

No content

Slide 112

IGNORE OR ADJUST FOR ALGORITHM-SUGGESTED RESULTS

Slide 113

LOOK TO CONTROL ENGINEERING

Slide 114

By Arturo Urquizo - http://commons.wikimedia.org/wiki/File:PID.svg, CC BY-SA 3.0, https://commons.wikimedia.org/w/index.php?curid=17633925
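
The diagram on this slide is the PID controller, control engineering's workhorse for holding a system with feedback at a setpoint rather than letting it run away. A minimal sketch against a toy drifting process; the gains are arbitrary and untuned:

```python
# Minimal discrete-time PID controller driving a toy process to a setpoint.
class PID:
    def __init__(self, kp, ki, kd, setpoint):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.setpoint = setpoint
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, measurement, dt):
        error = self.setpoint - measurement
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

pid, value = PID(kp=0.8, ki=0.1, kd=0.05, setpoint=50.0), 0.0
for _ in range(100):
    value += 1.0 + pid.update(value, dt=1.0)  # constant upward drift + control
print(round(value, 1))                         # settles near the 50.0 setpoint
```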

Slide 115

No content

Slide 116

CLASS I - PHANTOMS OF FALSE CORRELATION
Know what question you’re asking
Trust conditional probability over straight correlation

Slide 117

CLASS II - SPECTER OF BIASED SAMPLE DATA
Recognize data is biased even at rest
Make sure your sample set is crafted properly
Excise problematic predictors, but beware their shadow columns
Build a learning system that can incorporate false positives and false negatives as you find them
Try using adversarial techniques to detect bias

Slide 118

CLASS III - SHADE OF OVERLY-SIMPLISTIC MAXIMIZATION
Remember models tell you what was, not what should be
Try combining dependent columns and predicting that
Try complex algorithms that allow more flexible reinforcement

Slide 119

CLASS V – THE SIMULATION SURPRISE
Don’t confuse the map with the territory
Always reality-check solutions from simulations

Slide 120

CLASS VI - APPARITION OF FAIRNESS
Consider predictive accuracy as a resource to be allocated
Possibly seek external auditing of results, or at least another team

Slide 121

CLASS VII - THE FEEDBACK DEVIL
Ignore or adjust for algorithm-suggested results
Look to control engineering for potential answers

Slide 122

No content

Slide 123

No content

Slide 124

MODELS REPRESENT WHAT WAS
THEY DON’T TELL YOU WHAT SHOULD BE

Slide 125

No content

Slide 126

OR GET TRAINING

Slide 127

Bootcamps
Coursera
Udemy
Actual Universities

Slide 128

No content

Slide 129

AI Now Institute
Georgetown Law Center on Privacy and Technology
Knight Foundation’s AI ethics initiative
fast.ai

Slide 130

ABIDE BY ETHICS GUIDELINES

Slide 131

Privacy / Consent
Transparency of Use
Transparency of Algorithms
Ownership

Slide 132

https://www.accenture.com/_acnmedia/PDF-24/Accenture-Universal-Principles-Data-Ethics.pdf

Slide 133

Arthur Doler @arthurdoler arthurdoler@gmail.com
Slides:
Handout: bit.ly/art-ai-with-agenda