
Prediction and Control

Prediction and Control

Talk given at 32c3 about the implications of handing over decision making to automated systems


redshiftzero

December 28, 2015

Transcript

  1. Prediction and Control: Watching Algorithms Jennifer Helsby (@redshiftzero) Postdoctoral Fellow

    Computation Institute University of Chicago 32c3 December 29, 2015
  2. Outline • Algorithmic landscape • Advantages of intelligent systems • Current

    applications • The nightmare case • Properties of algorithmic systems • Fairness • Transparency • Accountability
  3. Technology is neither good nor bad; nor is it neutral.

    - Kranzberg’s First Law of Technology
  4. government collection social media IoT devices, sensors medical history voice

    location genetic information biometrics browsing profiles wearables emails and messaging logs
  5. human decision: • slow • biased • not transparent • difficult to audit

    algorithmic decision: • fast • unbiased? • transparent? • can audit?
  6. Data Inference Action

  7. Algorithmic decision making is expanding from industry into new domains.

  8. None
  9. None
  10. None
  11. VOTE VOTE VOTE VOTE VOTE VOTE VOTE VOTE

  12. data-driven city

  13. Advantages • Automate rote tasks • Enable us to distill

    huge volumes of data down to the critical pieces of information • Optimize resource allocation • Optimize actions to produce desired outcomes
  14. Algorithms can have serious implications.

  15. Chinese Social Credit System ranks each citizen based on their

    behavior, planned to cover all citizens by 2020 • financial record • criminal history • driver's license history • some medical information • purchase history • social media monitoring • social network analysis • … “carry forward sincerity and traditional virtues”
  16. What do we want?

  17. Privacy Fairness Transparency Accountability What do we want?

  18. I. Fairness

  19. Predictive Policing

  20. Predictive Policing individual level geographic level

  21. None
  22. None
  23. Fairness Concerns • Individual fairness: similar people treated

    similarly • Group fairness: protected groups treated similarly • Fairness in errors: errors should not be concentrated in protected groups (Zemel et al. 2013, Dwork et al. 2011)
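The group-level criteria on the slide above can be made concrete with a small sketch. Everything here is invented for illustration: a binary classifier's predictions are compared across two groups on selection rate (group fairness) and false-positive rate (fairness in errors).

```python
# Sketch: two of the fairness criteria above, for a binary classifier.
# Groups, labels, and predictions are synthetic illustration data.
from collections import defaultdict

def group_rates(groups, y_true, y_pred):
    """Per-group selection rate and false-positive rate."""
    stats = defaultdict(lambda: {"n": 0, "pos": 0, "neg": 0, "fp": 0})
    for g, yt, yp in zip(groups, y_true, y_pred):
        s = stats[g]
        s["n"] += 1
        s["pos"] += yp                 # predicted positive
        if yt == 0:
            s["neg"] += 1              # true negatives available
            s["fp"] += yp              # falsely flagged
    return {
        g: {
            # group fairness: are groups selected at similar rates?
            "selection_rate": s["pos"] / s["n"],
            # fairness in errors: are false positives concentrated in one group?
            "false_positive_rate": s["fp"] / s["neg"] if s["neg"] else 0.0,
        }
        for g, s in stats.items()
    }

groups = ["a", "a", "a", "a", "b", "b", "b", "b"]
y_true = [1, 0, 0, 0, 1, 0, 0, 0]
y_pred = [1, 0, 0, 0, 1, 1, 1, 0]
print(group_rates(groups, y_true, y_pred))
```

In this toy data, group "b" is flagged three times as often as group "a" and absorbs all of the false positives, even though the true base rates are identical.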
  24. Training Data Issues • crime databases contain only crimes detected

    by police • effect of biased policing, e.g. black people arrested at rates 3.73x that of whites for marijuana crimes [ACLU]
  25. • Race and location are correlated in the US: machine

    learning systems can learn sensitive features • No opt-out: citizens need more information about these systems
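The proxy problem above can be shown with a minimal synthetic sketch: even when the sensitive attribute is dropped from the training data, a correlated feature such as location can reconstruct it. The 90% neighbourhood/group correlation below is invented for illustration.

```python
# Sketch: a sensitive attribute leaks through a correlated proxy feature.
# All data is synthetic; "zip_code" stands in for any location feature.
import random

random.seed(0)

# Two neighbourhoods; group membership matches neighbourhood 90% of the time.
data = []
for _ in range(1000):
    zip_code = random.choice([0, 1])
    group = zip_code if random.random() < 0.9 else 1 - zip_code
    data.append((zip_code, group))

# Trivial "model": predict the (dropped) group attribute from the proxy alone.
accuracy = sum(zc == g for zc, g in data) / len(data)
print(f"group recovered from zip code alone: {accuracy:.0%}")
```

A real learned model would pick up the same signal implicitly, which is why removing the sensitive column is not enough to guarantee the fairness properties of the previous slide.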
  26. II. Transparency

  27. What are these systems doing?

  28. Complexity: algorithms must be human interpretable

  29. Influence

  30. Influencing Voting (Epstein and Robertson 2015): introduced a 20%

    shift in voter preferences. Alerting users to the fact that manipulation might be occurring did not decrease the amplitude of the effect. “it is a relatively simple matter to mask the bias in search rankings so that it is undetectable to virtually every user” [Chart: conservative and liberal participants, control vs. manipulated search rankings]
  31. behaviouralinsights.co.uk

  32. behaviouralinsights.co.uk

  33. None
  34. Optimization Industry: • Time spent on website • Click through

    rate • Likes • Profit Politics: • Votes for the desired candidate Policy: • Better use of government services • Voter registration • Health outcomes • Education outcomes • Compliance
  35. The invisible barbed wire of big data limits our lives

    to a space that might look quiet and enticing enough, but is not of our own choosing and that we cannot rebuild or expand. The worst part is that we do not see it as such. Because we believe that we are free to go anywhere, the barbed wire remains invisible. -Morozov
  36. III. Accountability

  37. oversight

  38. None
  39. Greene v. San Francisco • San Francisco ALPR gets (false)

    hit on car • Traffic stop of 47-year-old woman • Officers conducted pat-down and search, held her at gunpoint
  40. Auditing

  41. Inputs: test accounts, or real accounts (collaborative approach). Outputs: was

    one output shown to one user and not another?
  42. Browsing profile → Ad 1, Ad 2, … Ad n

  43. Datta et al. 2015: browsing profiles are randomly assigned to

    Treatment 1 or Treatment 2, and the ads (Ad 1 … Ad n) the ad network serves to each profile are collected and compared across treatments
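The randomized audit design above, in the spirit of Datta et al.'s AdFisher experiments, can be sketched as follows. `serve_ads()` is a hypothetical stand-in for a real browser-automation harness, and the ad-serving rates are invented; the point is the design: random assignment plus a permutation test on the observed difference.

```python
# Sketch of a randomized ad audit: assign fresh profiles to two
# treatments, record whether a target ad appears, and use a
# permutation test to ask whether the difference could be chance.
import random

random.seed(1)

def serve_ads(treatment):
    """Hypothetical ad network: treatment 1 sees the target ad more often."""
    p = 0.7 if treatment == 1 else 0.3
    return 1 if random.random() < p else 0

profiles = [random.choice([0, 1]) for _ in range(200)]  # random assignment
shown = [serve_ads(t) for t in profiles]

def diff_in_means(assign, outcome):
    g1 = [o for a, o in zip(assign, outcome) if a == 1]
    g0 = [o for a, o in zip(assign, outcome) if a == 0]
    return sum(g1) / len(g1) - sum(g0) / len(g0)

observed = diff_in_means(profiles, shown)

# Permutation test: shuffle the treatment labels and see how often a
# difference at least this large arises by chance.
perms = 2000
count = 0
for _ in range(perms):
    shuffled = profiles[:]
    random.shuffle(shuffled)
    if abs(diff_in_means(shuffled, shown)) >= abs(observed):
        count += 1
p_value = count / perms
print(f"observed difference {observed:.2f}, p ≈ {p_value:.3f}")
```

Because treatment assignment is randomized, a small p-value implicates the ad network's targeting rather than any pre-existing difference between the profiles.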
  44. None
  45. Who is in control?

  46. Technology • Audit tools: Sunlight: https://columbia.github.io/sunlight/ AdFisher: https://github.com/tadatitam/info-flow-experiments/ • View

    the landscape: anonymity systems (Tor) • Obfuscate by injecting noise (Adnauseam.io)
  47. Policy Regulation Independent ethics review 3rd party audits

  48. Closing thoughts Algorithmic systems need appropriate oversight in

    order to be controlled. The hacking and privacy advocacy community has an important role to play in this fight. Thanks! Jennifer Helsby (@redshiftzero), Postdoctoral Fellow, Computation Institute, University of Chicago
  49. References

    ACLU, “The War on Marijuana”, https://www.aclu.org/files/assets/aclu-thewaronmarijuana-rel2.pdf
    Bond, Farris, Jones, Kramer, Marlow, Settle and Fowler, “A 61-million-person experiment in social influence and political mobilization”, Nature 489, 295–298, 2012
    Datta, Tschantz, and Datta, “Automated Experiments on Ad Privacy Settings”, Proceedings on Privacy Enhancing Technologies 2015; 2015(1):92–112
    Dwork, Hardt, Pitassi, Reingold, and Zemel, “Fairness through Awareness”, ITCS ’12: Proceedings of the 3rd Innovations in Theoretical Computer Science Conference
    Epstein and Robertson, “The search engine manipulation effect (SEME) and its possible impact on the outcomes of elections”, PNAS 2015, vol. 112, no. 33
    Zemel, Wu, Swersky, Pitassi, and Dwork, “Learning Fair Representations”, ICML (3), volume 28 of JMLR Proceedings, pages 325–333
    Chinese Social Credit System: volkskrant.nl/buitenland/china-rates-its-own-citizens-including-online-behaviour~a3979668/ bbc.com/news/world-asia-china-34592186 newyorker.com/news/daily-comment/how-china-wants-to-rate-its-citizens
    Predictive Policing: azavea.com/blogs/newsletter/v8i5/hunchlab-2-0-defining-the-future-of-predictive-policing/ popsci.com/want-to-prevent-another-ferguson predpol.com
    Political Manipulation: newstatesman.com/politics/2014/06/facebook-could-decide-election-without-anyone-ever-finding-out politico.com/magazine/story/2015/08/how-google-could-rig-the-2016-election-121548
    Twins: wbay.com/2015/10/23/twins-denied-drivers-permit-because-dmv-cant-tell-them-apart/
    ALPR: Greene v. San Francisco, 9th Cir. Court
    Image credit: Freddy Martinez