Does this Algorithm have a Soul? - PyData Miami 2019

“Man is condemned to be free,” said Sartre. We as human beings are doomed and blessed by the power of choice, and sometimes our choices carry prejudice. What if we could hand this responsibility to a machine free of bias? Would we create a fair world, or just a monster in our own image?

Video: https://youtu.be/yNcUPORvhv4

Betina Costa

January 11, 2019
Transcript

  1. DOES THIS ALGORITHM HAVE A SOUL?
     BETINA COSTA

  2. Consultant at ThoughtWorks
    AfroPython Volunteer
    Data Science Student
    Introverted Movie Buff

  3. INDUSTRIAL REVOLUTION

  4. MACHINE LEARNING

  5. OPERATION SERENATA DE AMOR
     SINGAPORE HEALTHCARE

  6. CAN WE END BIAS THROUGH DATA?

  7. PREJUDICE

  8. BUT IS THAT TRUE?

  9. IN COURT

  10. PROPUBLICA SEARCH

  11. BUT WHY DID THAT HAPPEN?

  12. • How many of your friends/acquaintances are taking drugs illegally?
    • Was one of your parents ever sent to jail or prison?
    • How often did you get in fights at school?

  13. CRIME-PREDICTING ALGORITHMS

  14. FEEDBACK LOOP
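
    The runaway feedback loop (see "Runaway Feedback Loops in Predictive Policing" in the references) can be sketched in a few lines of Python. The district names, numbers, and greedy patrol rule below are illustrative assumptions, not data from the talk: two districts have the identical true incident rate, but patrols chase recorded history, so an arbitrary one-incident head start compounds into an apparent crime gap.

    ```python
    # Minimal caricature of "hot spot" policing: all patrols go to the
    # district with the most recorded incidents, and only a patrolled
    # district can record new incidents.
    TRUE_RATE = 0.10               # identical true incident rate per patrol
    N_PATROLS = 100                # patrols dispatched per day
    recorded = {"A": 11, "B": 10}  # history starts just one incident apart

    for day in range(50):
        hot = max(recorded, key=recorded.get)      # pick the "hot spot"
        recorded[hot] += round(N_PATROLS * TRUE_RATE)

    print(recorded)  # -> {'A': 511, 'B': 10}
    ```

    District A ends with 511 recorded incidents to B's 10 despite identical underlying behavior: the data the model learns from is a record of where we looked, not of where crime happened.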

  15. BRINGING IT CLOSER TO HOME

  16. FACIAL RECOGNITION BIAS

  17. SEARCH ENGINES

  18. STOP MACHINE LEARNING?

  19. DON’T PANIC

  20. 1. ACKNOWLEDGE THE PROBLEM

  21. 2. SPREAD THE PROBLEM

  22. 3. NO BLACK BOXES

  23. 4. CHANGE THE STATUS QUO

  24. 5. BE BETTER

  25. “Algorithms are opinions embedded in code”
     – CATHY O’NEIL

  26. THANK YOU!

  27. REFERENCES
    • Racial Bias and Gender Bias Examples in AI Systems
    • Machine Bias
    • When an Algorithm Helps Send You to Prison
    • Artificial Intelligence Is Now Used to Predict Crime. But Is It Biased?
    • Runaway Feedback Loops in Predictive Policing
    • Assessing Bias in Search Engines
    • The Hidden Gender Bias in Google Image Searches
    • Unequal Representation and Gender Stereotypes in Image Search Results for Occupations
    • Algorithms of Oppression - Safiya Umoja Noble
    • Weapons of Math Destruction - Cathy O’Neil
