Practical Ethics

How do you collect data and run experiments on users in an ethical way?

Presented as a keynote at O'Reilly Velocity NYC 2018.

Laura Thomson

October 03, 2018


  1. Practical Ethics Laura Thomson laura@mozilla.com @lxt

  7. “Do you think other browser makers collect this type of data?” –Anonymous commenter
  8. Not an ethicist

  9. How To Be Perfect

  10. How To Be Perfect

  11. Practical Ethics

  12. Standard Disclaimers: This is what we do. It’s not perfect. This approach is open source, so you can steal it and make it better. Give us your feedback so we can make it better too.
  13. Lean Data: Collect only what you need. Keep it for the minimum amount of time. Don’t violate user expectations.
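The Lean Data retention principle ("keep it for the minimum amount of time") can be sketched as a simple pruning pass. This is purely illustrative — the 30-day window, record shape, and `prune` function are invented for the example, not Mozilla policy:

```python
from datetime import datetime, timedelta, timezone

# Assumed retention window for illustration only; real policies vary per collection.
RETENTION = timedelta(days=30)

def prune(records, now=None):
    """Keep only records whose 'collected_at' timestamp is inside the retention window."""
    now = now or datetime.now(timezone.utc)
    return [r for r in records if now - r["collected_at"] <= RETENTION]
```

A scheduled job running something like this against stored telemetry enforces the "minimum time" principle mechanically rather than relying on ad hoc cleanup.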
  14. Classes of Data

  15. Category 1: Technical Data. Examples: OS, available memory, version number. Generally okay to collect, opt-out.
  16. Category 2: Interaction Data. Examples: # of tabs, session length, config settings, feature use. Generally okay to collect, opt-out.
  17. Category 3: Web Activity Data. Example: browsing history. Stickier. Usually no, but may be possible with mitigation.
  18. Category 4: Highly Sensitive Data. Examples: email, username, identifiers. Assume no. Maybe opt-in with advance notice, user consent, and secondary opt-out.
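The four categories above amount to a default-policy table, which could be encoded directly. The enum, policy dictionary, and function names below are invented for illustration and are not Mozilla's actual tooling:

```python
from enum import Enum

class DataCategory(Enum):
    """The four classes of data from the talk (illustrative encoding)."""
    TECHNICAL = 1         # OS, available memory, version number
    INTERACTION = 2       # number of tabs, session length, feature use
    WEB_ACTIVITY = 3      # browsing history
    HIGHLY_SENSITIVE = 4  # email, username, identifiers

# Hypothetical default policy per category: (collectable by default, consent model).
DEFAULT_POLICY = {
    DataCategory.TECHNICAL: (True, "opt-out"),
    DataCategory.INTERACTION: (True, "opt-out"),
    DataCategory.WEB_ACTIVITY: (False, "usually no; requires mitigation and review"),
    DataCategory.HIGHLY_SENSITIVE: (False, "opt-in with advance notice and consent"),
}

def may_collect_by_default(category: DataCategory) -> bool:
    """True only for categories 1-2, which are generally okay to collect opt-out."""
    collectable, _ = DEFAULT_POLICY[category]
    return collectable
```

Encoding the taxonomy this way makes the default answer a lookup, so a "yes" for category 3 or 4 data has to be an explicit, reviewed exception rather than a silent default.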
  19. Collecting data is simple: (1) request for collection; (2) review by a data steward. https://github.com/mozilla/data-review
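The two-step flow (request, then steward review) could be modeled as a small record plus a review gate. The field names, thresholds, and toy review logic below are assumptions for illustration — the real mozilla/data-review template and process differ:

```python
from dataclasses import dataclass

@dataclass
class CollectionRequest:
    """Illustrative request record; not the actual mozilla/data-review form."""
    description: str     # what is being collected and why
    category: int        # 1-4, per the classes of data above
    retention_days: int  # Lean Data: keep it for the minimum time
    approved: bool = False

def steward_review(req: CollectionRequest) -> CollectionRequest:
    """Toy review gate: categories 1-2 with bounded retention pass automatically;
    categories 3-4 always need human escalation (the 180-day bound is invented)."""
    req.approved = req.category <= 2 and req.retention_days <= 180
    return req
```

Even a toy gate like this captures the point of the process: every collection is written down, categorized, and approved by someone other than the engineer who wants the data.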
  20. What is a Data Steward?

  21. “Case Law”: Precedent allows reasoning about data collection and suggests alternatives.

  22. Privacy Preserving Data Collection

  23. Experiments

  24. –Rebecca Weiss, Director of Data Science: ‘By not performing A/B tests before we release new features and products, we are guilty of administering massive uncontrolled experiments upon our users. The only outcome measure that we can observe as a result of these experiments is “how many users have we driven away since we released that feature?”’
  28. Case Studies

  30. How’d that happen? Good intentions, road to hell, etc. No data collected. No one felt empowered to say no.
  31. What did we learn? More formal process. Definition of red flags. Deeper engineering review. Documented escalation paths.
  33. “Burn it all. Burn it to the ground.”

  34. Fin. We can all do better. Learn from your mistakes. Steal these ideas. Steward your users’ data wisely. Come ask questions.
  35. References • https://wiki.mozilla.org/Firefox/Data_Collection • https://github.com/mozilla/data-review • https://wiki.mozilla.org/Firefox/Shield/PHD • https://testpilot.firefox.com/