
Data Privacy Day @ Microsoft 2013

An edited version of the talk I gave at Data Privacy Day at Microsoft, on designing privacy for smartphones. The full paper, which will be presented at CHI in April, is available here: http://patrickgagekelley.com/papers/CHI13android.pdf

Patrick Gage Kelley

January 31, 2013

Transcript

  1. Patrick Gage Kelley Designing for Privacy with Smartphone Data @patrickgage

  2. None
  3. Delta is the first company the California Attorney General sued for not having a mobile app privacy policy.
  4. [Image: cover page of the complaint, The People of the State of California v. Delta Air Lines, Inc., Superior Court of California, City and County of San Francisco, Case No. CGC-12-526741, filed December 6, 2012, seeking civil penalties and injunctive relief for violations of Business and Professions Code Section 17200 (Unfair Competition Law).]
  5. In 1972, the People of the State of California made privacy an “inalienable right” in the California Constitution. (Cal. Const., Art. I, § 1.)
  6. Today, consumers regularly use computers, smartphones, tablets, and other electronic devices to share and store sensitive personal information...
  7. Today, consumers regularly use computers, smartphones, tablets, and other electronic devices to share and store sensitive personal information... including their full name, date of birth, contact information, photographs, bank accounts, credit card numbers, and location information.
  8. Delta’s mobile application may be used to check-in online for an airplane flight, view reservations for air travel, rebook cancelled or missed flights, pay for checked baggage, track checked baggage, access a user’s frequent flyer account, take photographs, and even save a user’s geo-location.
  9. Despite collecting substantial personally identifiable information ("PII") such as a user's full name, telephone number, email address, frequent flyer account number and PIN code, photographs, and geo-location, the Fly Delta application does not have a privacy policy. It does not have a privacy policy in the application itself, in the platform stores from which the application may be downloaded, or on Delta's website.
  10. “Losing your personal privacy should not be the cost of using mobile apps, but all too often it is.” (Kamala Harris, California Attorney General)
  11. But we already know people don’t read privacy policies.
    [Images: first pages of three papers on the usability of privacy policies:
    “Privacy Policies as Decision-Making Tools: An Evaluation of Online Privacy Notices,” Carlos Jensen and Colin Potts, CHI 2004;
    “Privacy and Rationality in Individual Decision Making,” Alessandro Acquisti and Jens Grossklags, IEEE Security & Privacy, January/February 2005;
    “A Comparative Study of Online Privacy Policies and Formats,” Aleecia M. McDonald, Robert W. Reeder, Patrick Gage Kelley, and Lorrie Faith Cranor.]
  12. Four Phases of Testing
    Phase 1: 20-participant exploratory interviews
    Phase 2: Several 50-participant MTurk iterations
    Phase 3: 20-participant laboratory interview and application selection experiment
    Phase 4: 250-participant MTurk application selection experiment and survey
  13. Apps that come on the phone. Apps that come from a trusted/already known brand. Apps that are picked from the market to fill a need.
  14. Apps that come on the phone. The most used apps: phone, mail, text messaging, weather, directions, maps... But this also includes many apps users wish they could remove.
  15. Apps that come from a trusted/already known brand: Facebook, Twitter, Pandora, Spotify, Angry Birds, The New York Times, Words with Friends, ESPN, etc.
  16. Apps that are picked from the market to fill a need. How do users make this decision?
  17. 17

  18. 18

  19. How users say they pick apps. Across our studies participants:
    - wanted bigger and more screenshots
    - looked at other apps users viewed/installed
    - quoted the text descriptions
    - mentioned salient reviews
    - wanted to read more reviews
    - occasionally discussed update frequency/size
    - referred to specific apps they knew of
  20. How users report they pick apps
    [Chart: factors rated from "Very important" to "Not important": ratings, user reviews, price, branding and design, word of mouth, # downloads, popularity, permissions, size of the app, developer/company, advertising.]
  21. Why not permissions?
    - Users do not understand Android permissions
    - The terms used are vague or confusing, sometimes misleading, jargon-filled, and poorly grouped
    - The permissions appear only after the user has pressed “download,” after their decision has effectively been made
  22. 22 privacy

  23. 23 Android permissions screens

  24. privacy: The privacy information could be included on this screen
  25. 25 privacy privacy

  26. meters, highlights, icons, checklist
  27. So we have two possible issues with the privacy information: Format and Position
  28. 28

  29. 29

  30. Privacy Facts Checklist
    - Bold header “Privacy Facts”
    - Eight types of information
    - Advertising and analytics
    - Checkbox next to each
    - Immediately after the Description section
    - Immediately before the Reviews section
  31. Application Selection Task
    - Privacy Facts Checklist vs. Android Market
    - Users select one app per category
    - Each category has two apps
    - One requests fewer permissions
    Categories: calorie tracking, word game, streaming music, Twitter, document scanning, flight tracker
  32. Category Differences
    Baseline for each app: 4 stars, 10,000-50,000 downloads, 3 similar reviews
    - Calorie tracking
    - Word game
    - Twitter
    - Document scanning
    - Streaming music: (brand) 50 million downloads
    - Flight tracker: (3 stars)
  33. Application Selection (MTurk), n = 366
    Percent choosing the app requesting fewer permissions, per category:
                               Word game  Nutrition  Twitter  Doc. scanning  Music (brand)  Flight tracking (3/4)
    Privacy Facts Checklist       61%        73%       53%        60%            29%             35%
    Permissions                   41%        56%       25%        73%            18%             41%
    Permissions Inline            50%        73%       35%        63%            23%             37%
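For readers who want to work with these numbers, a small Python sketch (purely illustrative; the percentages are copied directly from the slide above) tabulates the selection rates and averages them per interface condition:

```python
# Selection rates from the MTurk study: percent of participants who chose
# the app requesting fewer permissions, by interface condition and category.
# (Numbers copied from the slide; this tabulation is purely illustrative.)
results = {
    "Privacy Facts Checklist": {"Word game": 61, "Nutrition": 73, "Twitter": 53,
                                "Document scanning": 60, "Music": 29, "Flight tracking": 35},
    "Permissions":             {"Word game": 41, "Nutrition": 56, "Twitter": 25,
                                "Document scanning": 73, "Music": 18, "Flight tracking": 41},
    "Permissions Inline":      {"Word game": 50, "Nutrition": 73, "Twitter": 35,
                                "Document scanning": 63, "Music": 23, "Flight tracking": 37},
}

# Average across the six categories for each condition.
for condition, by_category in results.items():
    mean = sum(by_category.values()) / len(by_category)
    print(f"{condition}: {mean:.1f}% chose the lower-permission app on average")
```

Averaged this way, the checklist condition leads the other two, which is the effect the next slide summarizes.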
  34. With the checklist, people more often select the application that accesses fewer permissions, though other factors like brand and rating are stronger or remove the effect.
  35. With the privacy checklist:
    - No one thought the new display was out of place
    - No one stated permissions were missing
  36. People said it wasn’t useful:
    “It didn’t influence my decision even though I noticed it. I tend to pay more attention to ratings and usefulness then anything else.”
    “No, not really. It’s not the most important factor. I don’t keep a bunch of vital personal info on my phone, so no worries. I think people who do are really stupid.”
  37. People said it was useful:
    “Yes. It only influenced me if it seemed to be the only thing to distinguish between the two apps.”
    “Yeah, I always check that stuff. I want to know exactly what is happening to and with my data from that program when I use it. It was useful though I wish some apps would go into greater detail about why certain things are there.”
  38. Not concerned with data sharing:
    - All their data is already out there
    - Android/Google are protecting them
    Participants wanted reasons:
    - Watching out for apps that take too much
    - ...but will make up reasons when asked why an app might need a certain permission
  39. Overall, privacy information at decision time helps users:
    - More likely to mention “information” or “data”
    - Said they would be more likely to consider privacy
    - The checklist influences app selection
    And format matters:
    - The formatting and terms used change the effect
  40. So, what’s next?
    - Need a second revision to clarify terms
    - Testing with more (than 2) apps
    - Testing with more market functionality
    - Larger scale test
    - Automated privacy checklist/ratings
  41. This work will be published at ACM SIGCHI 2013 on April 27th in Paris. For more details, see the draft paper at http://patrickgagekelley.com/papers/CHI13android.pdf
  42. Patrick Gage Kelley (@patrickgage, me@patrickgage.com, patrickgagekelley.com)
    With: Lorrie Faith Cranor, Alessandro Acquisti, Seungyeop Han, Matthew Kay, Michelle Mazurek, David Wetherall
    Special thanks to: Janice Tsai (Privacy Manager, Microsoft), Norman Sadeh, Sunny Consolvo, Jaeyeon Jung, Jialiu Lin, Manya Sleeper, Tim Vidas
  43. This work was supported in part by:
    - U.S. Army Research Office (DAAD19-02-1-0389 and W911NF-09-1-0273)
    - NSF Cyber Trust grant CNS-0627513 (Nudging Users Towards Privacy), CNS-0831428, CNS-0905562, CNS-1012763, and DGE-0903659 (IGERT: Usable Privacy and Security)
    - Microsoft, through the Carnegie Mellon Center for Computational Thinking
    - FCT, through the CMU/Portugal ICTI
    - IBM OCR project on Privacy and Security Policy Management
    - Google
    - Intel Labs Seattle
    - The University of Washington
    - The University of New Mexico
    - Carnegie Mellon’s CyLab