
Prediction and Control


A talk given at 32c3 on the implications of handing decision-making over to automated systems

redshiftzero

December 28, 2015

Transcript

  1. Prediction and Control: Watching Algorithms. Jennifer Helsby (@redshiftzero), Postdoctoral Fellow, Computation Institute, University of Chicago. 32c3, December 29, 2015
  2. Outline: • Algorithmic landscape • Advantages of intelligent systems • Current applications • The nightmare case • Properties of algorithmic systems • Fairness • Transparency • Accountability • Paths forward
  3. Technology is neither good nor bad; nor is it neutral. - Kranzberg’s First Law of Technology
  4. government collection • social media • IoT devices, sensors • medical history • voice • location • genetic information • biometrics • browsing profiles • wearables • emails and messaging logs
  5. human decision (• slow • biased • not transparent • difficult to audit) vs. algorithmic decision (• fast • unbiased? • transparent? • can audit?)
  6. Advantages • Automate rote tasks • Enable us to distill huge volumes of data down to the critical pieces of information • Optimize resource allocation • Optimize actions to produce desired outcomes
  7. Chinese Social Credit System ranks each citizen based on their behavior, to be used by all citizens by 2020 • financial record • criminal history • driver’s license history • some medical information • purchase history • social media monitoring • social network analysis • … “carry forward sincerity and traditional virtues”
  8. Fairness Concerns: Individual Fairness (similar people treated similarly) • Group Fairness (protected groups treated similarly) • Fairness in Errors (errors should not be concentrated in protected groups) [Zemel et al. 2013; Dwork et al. 2011]
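The group-level notions on this slide can be made concrete with a small sketch. The function names and toy data below are illustrative, not taken from the talk or the cited papers:

```python
# Illustrative metrics for two of the slide's fairness notions.
# All names and data here are invented for the example.

def demographic_parity_gap(preds, groups):
    """Group fairness: gap in positive-prediction rates between groups."""
    rate = {}
    for g in set(groups):
        members = [p for p, gg in zip(preds, groups) if gg == g]
        rate[g] = sum(members) / len(members)
    return max(rate.values()) - min(rate.values())

def error_rate_gap(preds, labels, groups):
    """Fairness in errors: gap in error rates between groups."""
    err = {}
    for g in set(groups):
        pairs = [(p, y) for p, y, gg in zip(preds, labels, groups) if gg == g]
        err[g] = sum(p != y for p, y in pairs) / len(pairs)
    return max(err.values()) - min(err.values())

# Toy example: group "a" receives positive predictions far more often than "b".
preds  = [1, 1, 1, 0, 0, 0, 0, 0]
labels = [1, 1, 0, 0, 1, 0, 1, 0]
groups = ["a", "a", "a", "a", "b", "b", "b", "b"]
print(demographic_parity_gap(preds, groups))          # 0.75
print(error_rate_gap(preds, labels, groups))          # 0.25
```

A system can score perfectly on one of these gaps while failing the other, which is why the slide lists them as distinct concerns.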
  9. Training Data Issues • crime databases contain only crimes detected by police • effect of biased policing, e.g. black people arrested at 3.73x the rate of whites for marijuana crimes [ACLU]
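The mechanism behind this slide fits in a few lines: if two groups offend at the same rate but one is arrested at 3.73x the rate of the other (the slide's ACLU figure), the resulting crime database is skewed by the same factor. Only the 3.73x ratio comes from the slide; every other number below is invented:

```python
# Sketch: a crime database records only crimes *detected* by police.
# Equal true offending rates + unequal arrest rates => a skewed database.
# Only the 3.73x ratio is from the slide (ACLU); other numbers are made up.
population = 100_000
true_rate = 0.10                        # same underlying offending rate in both groups
arrest_rate = {"group_1": 0.02,         # chance an offense is recorded
               "group_2": 0.02 * 3.73}  # same behavior, 3.73x the enforcement

recorded = {g: round(population * true_rate * r) for g, r in arrest_rate.items()}
print(recorded)  # group_2 appears ~3.73x more "criminal" despite identical behavior
```

Any model trained on `recorded` counts inherits the enforcement bias, not the underlying behavior.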
  10. • Race and location are correlated in the US: machine learning systems can learn sensitive features • No opt-out: citizens need more information about these systems
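A minimal illustration of the proxy problem: even with race excluded from the features, a correlated stand-in (here a synthetic zip code) recovers it. The population, zip codes, and mixing rates are all invented for this sketch:

```python
# Sketch: a sensitive attribute excluded from the features can still be
# recovered through a correlated proxy. All data here is synthetic.
import random

random.seed(0)
# Synthetic population: zips 0-4 are 90% group A, zips 5-9 are 90% group B.
people = []
for _ in range(1000):
    zip_code = random.randrange(10)
    majority = "A" if zip_code < 5 else "B"
    group = majority if random.random() < 0.9 else ("B" if majority == "A" else "A")
    people.append((zip_code, group))

# "Model": predict each person's group using *only* their zip code.
def predict(zip_code):
    return "A" if zip_code < 5 else "B"

accuracy = sum(predict(z) == g for z, g in people) / len(people)
print(f"group recovered from zip alone: {accuracy:.0%}")  # around 90%, per the 0.9 mixing rate
```

The stronger the geographic segregation, the better the proxy, which is the slide's point about the US specifically.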
  11. Influencing Voting: Epstein and Robertson 2015 introduced a 20% shift in voter preferences. Alerting users to the fact that manipulation might be occurring did not decrease the magnitude of the effect. “it is a relatively simple matter to mask the bias in search rankings so that it is undetectable to virtually every user” [Figure: preference shifts for Conservative and Liberal groups under Control vs. Manipulate conditions]
  12. Optimization. Industry: • Time spent on website • Click-through rate • Likes • Profit. Politics: • Votes for the desired candidate. Policy: • Better use of government services • Voter registration • Health outcomes • Education outcomes • Compliance
  13. “The invisible barbed wire of big data limits our lives to a space that might look quiet and enticing enough, but is not of our own choosing and that we cannot rebuild or expand. The worst part is that we do not see it as such. Because we believe that we are free to go anywhere, the barbed wire remains invisible.” - Morozov
  14. Greene vs. San Francisco • San Francisco ALPR gets (false) hit on car • Traffic stop of 47-year-old woman • Officers conducted pat-down, search, held her at gunpoint
  15. Datta et al. 2015: fresh browsing profiles are assigned to Treatment 1 or Treatment 2; the ad network then serves each profile a set of ads (Ad 1 … Ad n), and the ad distributions are compared across treatments. [Diagram: Ad Network connected to four Browsing Profiles under the two treatments]
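The design on this slide can be sketched as a randomized experiment followed by a permutation test. The ad-serving function below is simulated; in the real experiments the ads come from the ad network itself, and every parameter here is made up:

```python
# Sketch of the experimental design: profiles split across two treatments,
# ads recorded per profile, then a permutation test on the difference.
# serve_ads() simulates the ad network; in reality its behavior is observed.
import random

random.seed(1)

def serve_ads(treatment, n_ads=20):
    """Simulated ad network: treatment 1 sees the target ad more often."""
    p_target = 0.4 if treatment == 1 else 0.1
    return ["target" if random.random() < p_target else "other" for _ in range(n_ads)]

# Profiles assigned alternately to the two treatments (randomized in practice).
profiles = [1, 2] * 20
target_counts = {1: [], 2: []}
for t in profiles:
    target_counts[t].append(serve_ads(t).count("target"))

observed = (sum(target_counts[1]) / len(target_counts[1])
            - sum(target_counts[2]) / len(target_counts[2]))

# Permutation test: shuffle treatment labels and see how often a gap
# at least this large arises by chance.
all_counts = target_counts[1] + target_counts[2]
n1 = len(target_counts[1])
extreme, trials = 0, 2000
for _ in range(trials):
    random.shuffle(all_counts)
    gap = sum(all_counts[:n1]) / n1 - sum(all_counts[n1:]) / (len(all_counts) - n1)
    if abs(gap) >= abs(observed):
        extreme += 1
print(f"observed gap: {observed:.2f} target ads per profile, p ~ {extreme / trials:.3f}")
```

A low p-value indicates the ad network treats the two browsing profiles differently, which is exactly the kind of behavioral difference the paper set out to detect.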
  16. Closing thoughts: Algorithmic systems need appropriate oversight in order to be controlled. The hacking and privacy advocacy community has an important role to play in this fight. Thanks! Jennifer Helsby (@redshiftzero), Postdoctoral Fellow, Computation Institute, University of Chicago
  17. References
     ACLU, “The War on Marijuana”, https://www.aclu.org/files/assets/aclu-thewaronmarijuana-rel2.pdf
     Bond, Farris, Jones, Kramer, Marlow, Settle and Fowler, “A 61-million-person experiment in social influence and political mobilization”, Nature 489, 295–298, 2012
     Datta, Tschantz, and Datta, “Automated Experiments on Ad Privacy Settings”, Proceedings on Privacy Enhancing Technologies 2015; 2015(1):92-112
     Dwork, Hardt, Pitassi, Reingold, and Zemel, “Fairness through Awareness”, ITCS ’12: Proceedings of the 3rd Innovations in Theoretical Computer Science Conference
     Epstein and Robertson, “The search engine manipulation effect (SEME) and its possible impact on the outcomes of elections”, PNAS 2015, vol. 112, no. 33
     Zemel, Wu, Swersky, Pitassi, and Dwork, “Learning Fair Representations”, ICML (3), volume 28 of JMLR Proceedings, pages 325-333
     Chinese Social Credit System: volkskrant.nl/buitenland/china-rates-its-own-citizens-including-online-behaviour~a3979668/ • bbc.com/news/world-asia-china-34592186 • newyorker.com/news/daily-comment/how-china-wants-to-rate-its-citizens
     Predictive Policing: azavea.com/blogs/newsletter/v8i5/hunchlab-2-0-defining-the-future-of-predictive-policing/ • popsci.com/want-to-prevent-another-ferguson • predpol.com
     Political Manipulation: newstatesman.com/politics/2014/06/facebook-could-decide-election-without-anyone-ever-finding-out • politico.com/magazine/story/2015/08/how-google-could-rig-the-2016-election-121548
     Twins: wbay.com/2015/10/23/twins-denied-drivers-permit-because-dmv-cant-tell-them-apart/
     ALPR: Greene v. San Francisco, 9th Cir. Court
     Image credit: Freddy Martinez