
Talking about contextual integrity

For LIS 510 "Information Security and Privacy"

Dorothea Salo

June 22, 2021

Transcript

  1. Talking about contextual integrity
    Dorothea Salo


  2. The story so far…
    ✦ Contextual integrity theory is Helen Nissenbaum
    et al.’s attempt to figure out when people DO and
    DON’T sense a privacy violation.


    ✦ Also a reaction against US notice-and-consent law, which posits “if
    they signed off on it in any way whatever, it’s not a privacy violation.”
    Which is… how shall I put this?… garbage.


    ✦ Book: Privacy in Context, by Nissenbaum.


    ✦ It’s a heavy read (though fairly short). Nissenbaum is an ethicist, and
    ethicists write the way zambonis clear ice rinks: carefully, thoroughly,
    and very very slowly.


    ✦ It’s also HUGELY influential, at least in privacy and ethics scholarship.
    (Bit less so in industry, but you’ll see it mentioned.)


    ✦ It’s more useful than its (sorry!) rather pedantic style might suggest.


  3. Like, why is this even hard?
    ✦ Because some folks want bright-line privacy
    rules: this is always okay, that is always not.


    ✦ Nissenbaum: “Sorry. It’s not that simple.”


    ✦ Because we don’t always react to things in
    proportion to the actual risks.


    ✦ Sometimes we FREAK ALL THE WAY OUT about stuff that,
    considered carefully, isn’t all that major or scary.


    ✦ Sometimes we let really horrible privacy-destroying stuff slide.


    ✦ WHYYYYYYYYYYYYY? asked Nissenbaum.


    ✦ Why is this, in both directions?


    ✦ And what is it that we actually respond to, if it’s not the risk level?


    ✦ And what should we be able to expect by way of privacy protection?


  4. Yeah, Dr. N, what should
    we be able to expect?
    ✦ Nissenbaum: it’s our right to live in a world
    where our expectations about information flow
    are (for the most part) respected.


    ✦ Simple as that.


    ✦ Consider using this idea to prioritize your
    campus data report and your communicative
    artifact. Lean into situations where your
    expectations weren’t respected!


  5. CI basics!
    ✦ Privacy happens when INFORMATION FLOW about us is
    appropriate, and breaks when it’s not.


    ✦ Who decides what’s appropriate? We all do.


    ✦ We all have expectations about when it’s okay to share information, or have
    other people share information about us.


    ✦ Our appropriateness decisions are conditioned by SOCIAL
    NORMS. These norms change over time.


    ✦ They’re also conditioned by WHAT WE ACTUALLY UNDERSTAND about
    information flows, and I really wish CI theorists gave more time to this.


    ✦ Both social and individual information-flow norms
    depend on SPECIFIC SOCIAL CONTEXTS.


    ✦ Basic example: It’s okay for me to discuss your work in this course with you.
    That’s appropriate. Complaining about it on my public Twitter, using your
    name? OH HECK NO. Utterly inappropriate!


    ✦ But it’s the same information! So it’s the context that matters.


  6. So what’s different about this?
    ✦ Some definitions of privacy equate it to secrecy.
    Contextual integrity doesn’t!


    ✦ Information doesn’t have to be totally secret to be private in some way.


    ✦ Some equate it to compliance with applicable law.
    Contextual integrity doesn’t!


    ✦ An information flow can be legal and still violate contextual integrity.


    ✦ Some think privacy means individuals’ control over
    their data. Contextual integrity doesn’t!


    ✦ You might not be in control of a given information flow, but that doesn’t
    mean it bothers you. (If I discuss your group’s project with another
    member of your group, does that violate your privacy? I’m guessing
    not… even if you don’t even know it’s happening.)


    ✦ Or you could have control, make a mistake with it, and still feel your
    privacy was violated. (Has an online service ever done you dirty?)


  7. The Five Parameters
    ✦ Per Nissenbaum, people sensing a privacy
    violation are reacting to one (or more) of five
    aspects of a data-sharing transaction:


    ✦ DATA SUBJECT: who is this data about?


    ✦ DATA SENDER: who’s sharing the data?


    ✦ DATA RECIPIENT: who’s receiving the data?


    ✦ INFORMATION TYPE: (exactly what you think it is)


    ✦ TRANSMISSION PRINCIPLE (hold that thought; I’ll get to it)


    ✦ WRITE THESE DOWN, PLEASE.


    ✦ You’ll want them as you work through your campus data report.


    ✦ Now I’ll explain them, with examples.


    ✦ Especially that last one. It’s a bit tricky. (There’s also a little sketch
    of all five as a data structure right after this slide.)

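    A quick way to hold the five parameters in your head (or in your report notes) is
    to treat each information flow as a little record with five fields. This is just my
    illustration in Python, not Nissenbaum’s notation; the class and field names are
    made up for the example.

    from dataclasses import dataclass

    @dataclass(frozen=True)
    class InformationFlow:
        """One data-sharing transaction, described by the five CI parameters."""
        data_subject: str            # who the data is about
        data_sender: str             # who is sharing it
        data_recipient: str          # who is receiving it
        information_type: str        # what kind of data it is
        transmission_principle: str  # the constraint governing the flow

    # The example from slide 5: same information, very different context.
    fine = InformationFlow("a student", "the instructor", "that same student",
                           "coursework feedback", "discussed privately, one-to-one")
    not_fine = InformationFlow("a student", "the instructor", "the public (Twitter)",
                               "coursework feedback", "broadcast, with the student named")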

  8. Data subject
    ✦ Data that it’s okay to share about one person may
    not be at all okay to share about another.


    ✦ Example: Children. In the US, there are more
    legal restrictions on collecting and sharing data
    about children than about adults.


    ✦ The rationale being that children are more vulnerable than adults,
    less autonomous, and less able to protect themselves.


    ✦ Example: Folks with limited cognitive capacity


    ✦ For whatever reason (age, some disabilities, both)


    ✦ Again, the rationale is that sharing data about vulnerable people who
    don’t understand what’s going on well enough to object is Not Okay.


  9. Data sender
    ✦ Not everybody’s entitled to share data about you,
    even if they got it in a reasonable way.


    ✦ Example: attorney-client privilege


    ✦ If you’re paying a lawyer, they need to keep their mouth shut about
    what you tell them except for what they must disclose to get the legal
    work done that you’re paying them for.


    ✦ Example: HIPAA, US health-records privacy law


    ✦ Heavily regulates what HEALTH PROFESSIONALS can share (also when
    and under what circumstances)


    ✦ Doesn’t actually regulate anybody else who gets their paws on your
    health data! Got a Fitbit? DUMP IT. Period-tracking app? DELETE IT.
    Don’t buy that health-tracking smartwatch, okay? All of these can rat
    you out (and actually have ratted people out). HIPAA doesn’t care.


  10. Data recipient
    ✦ Not everybody is entitled to have your data
    shared with them.


    ✦ Example: FERPA, US educational-record law. It’s
    complicated, but for example:


    ✦ If you’re my advisee, I am allowed to see what courses you’ve taken
    and what grades you’ve earned. (I need to know to do my job!) But I
    can’t go tell your boss; FERPA will smack me down. Bad recipient!


    ✦ Not my advisee? Your course choices and grades are none of my
    business, unless you actually tell me about them (which is your right)
    OR I have a solid, education-related reason I need to know.


    ✦ (For example: I help assess undergraduates for the Phi Beta Kappa
    honor society. I see transcripts, including grades, for students under
    consideration. FERPA is fine with this.)


    ✦ Without such a reason, FERPA does not consider me an acceptable
    data recipient.


  11. Information type
    ✦ Exactly what you think it is. Some data is more
    sensitive than other data.


    ✦ Example: personal identifiers


    ✦ Such as (US) social-security numbers, credit-card numbers, passport
    and other ID numbers, and so on.


    ✦ They’re sensitive because if they leak, somebody can do you a whole
    lot of harm with them.


    ✦ Example: health data, again


    ✦ In the first phase of Data Doubles, we found that a lot of our
    respondents had trouble coming up with information types that they
    considered sensitive enough to restrict campus access to.


    ✦ There was a notable exception: data about mental-health treatment
    or other counseling. “Keep that private!” several said.


  12. Transmission principle
    ✦ Roughly, “what limitations are there on this
    information flow?”


    ✦ Some variant on “don’t randomly blab it!” is a
    really common transmission principle.


    ✦ Example: encryption in web browsers


    ✦ Web browsers are BUILT for information flow! That’s their whole
    reason for existing!


    ✦ Data subject is presumptively fine (or, breaches and errors and doxers
    aside, the info wouldn’t be on the web in the first place), data sender
    and recipient are fine, information type is fine…


    ✦ … but the entire world doesn’t need to be able to peek in on what I’m
    surfing. Especially if it involves my credit union or my doctor or my
    boss. And early web browsers didn’t have any way to prevent that!


    ✦ They do now. (Up to a point. We’ll talk about it. There’s also a tiny
    sketch of “encrypt it in transit” as a transmission principle after this slide.)

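    To make “encrypt it in transit” concrete as a transmission principle, here’s a
    minimal sketch — my own toy example, not anything from the book. The function
    name and URLs are made up; the idea is simply that the flow is refused unless
    the connection is encrypted.

    from urllib.parse import urlparse

    def enforce_https(url: str) -> str:
        """Toy transmission-principle check: only allow encrypted-in-transit flows."""
        if urlparse(url).scheme != "https":
            raise ValueError(f"refusing unencrypted flow to {url!r}")
        return url

    enforce_https("https://example.org/my-account")    # fine: encrypted in transit
    # enforce_https("http://example.org/my-account")   # raises: anyone on the path could peek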

  13. So, how do I use this,
    Dorothea?
    ✦ Great question. Two ways, I think.


    ✦ One: when you learn about something that you
    consider a privacy violation, use the Five
    Parameters to dig into what feels wrong about it.


    ✦ Would you be okay if it was a different information type? If they
    asked you first (transmission principle)? If it wasn’t going to That
    Person or That Company or Those Cops or That University Office?


    ✦ Two: in your campus-data report, evaluate
    everything you find out about information flows
    against the Five Parameters. If something’s not
    right, explain precisely why not! (There’s a small
    sketch of that comparison right after this slide.)

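    One way to “explain precisely why not” is to write down the flow you expected in
    that context and the flow you actually found, then see which of the five parameters
    shifted. A minimal sketch, with my own made-up helper name and example values,
    not any official CI tooling:

    # The five CI parameters, as plain dictionary keys (my naming).
    PARAMETERS = ("data_subject", "data_sender", "data_recipient",
                  "information_type", "transmission_principle")

    def changed_parameters(expected: dict, observed: dict) -> list[str]:
        """Report which parameters differ between the expected and observed flow."""
        return [p for p in PARAMETERS if expected.get(p) != observed.get(p)]

    expected = {"data_subject": "me", "data_sender": "the registrar",
                "data_recipient": "my advisor", "information_type": "my grades",
                "transmission_principle": "shared only for advising"}
    observed = dict(expected, data_recipient="a third-party analytics vendor")

    print(changed_parameters(expected, observed))  # ['data_recipient']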

  14. “Tyranny of the normal”
    ✦ What if social norms for a particular information
    flow are broadly accepted, but still kind of bad?


    ✦ An example to think about: video surveillance of public spaces, especially
    when combined with facial-recognition technology.


    ✦ Strict contextual-integrity theory says “meh, it’s the
    norm, let it go.” Not an ideal answer!


    ✦ This totally happens. Adtech is a great example! Repeated studies now
    indicate that once people understand how adtech tracking works and
    where the data goes, they HATE it. But it’s still considered normal!


    ✦ Nissenbaum: take a step back and consider the
    reasons this information flow exists, and the ethics
    binding the collecting/sharing parties.


    ✦ MA/LIS folks: this is where our ethics codes around privacy, equitable
    service, and putting patrons over vendors kick in.


  15. Cool? Cool. Thanks!
    This presentation is copyright 2021 by Dorothea Salo.




    It is available under a Creative Commons Attribution 4.0 International license.
