For LIS 510 "Information Security and Privacy"
The story so far…
✦ Contextual integrity theory is Helen Nissenbaum et al.’s attempt to figure out when people DO and DON’T sense a privacy violation.
✦ Also a reaction against US notice-and-consent law, which posits “if they signed off on it in any way whatever, it’s not a privacy violation.”
Which is… how shall I put this?… garbage.
✦ Book: Privacy in Context, by Nissenbaum.
✦ It’s a heavy read (though fairly short). Nissenbaum is an ethicist, and
ethicists write the way zambonis clear ice rinks: carefully, thoroughly,
and very very slowly.
✦ It’s also HUGELY influential, at least in privacy and ethics scholarship.
(Bit less so in industry, but you’ll see it mentioned.)
✦ It’s more useful than its (sorry!) rather pedantic style might suggest.
Like, why is this even hard?
✦ Because some folks want bright-line privacy
rules: this is always okay, that is always not.
✦ Nissenbaum: “Sorry. It’s not that simple.”
✦ Because we don’t always react to things in
proportion to the actual risks.
✦ Sometimes we FREAK ALL THE WAY OUT about stuff that, considered carefully, isn’t all that major or scary.
✦ Sometimes we let really horrible privacy-destroying stuff slide right on by.
✦ WHYYYYYYYYYYYYY? asked Nissenbaum.
✦ Why is this, in both directions?
✦ And what is it that we actually respond to, if it’s not the risk level?
✦ And what should we be able to expect by way of privacy protection?
Yeah, Dr. N, what should
we be able to expect?
✦ Nissenbaum: it’s our right to live in a world
where our expectations about information
are (for the most part) respected.
✦ Simple as that.
✦ Consider using this idea to prioritize your
campus data report and your communicative
artifact. Lean into situations where your
expectations weren’t respected!
✦ Privacy happens when INFORMATION FLOW about us is
appropriate, and breaks when it’s not.
✦ Who decides what’s appropriate? We all do.
✦ We all have expectations about when it’s okay to share information, or have
other people share information about us.
✦ Our appropriateness decisions are conditioned by SOCIAL
NORMS. These norms change over time.
✦ They’re also conditioned by WHAT WE ACTUALLY UNDERSTAND about information flows, and I really wish CI theorists gave more time to this.
✦ Both social and individual information-sharing norms depend on SPECIFIC SOCIAL CONTEXTS.
✦ Basic example: It’s okay for me to discuss your work in this course with you.
That’s appropriate. Complaining about it on my public Twitter, using your
name? OH HECK NO. Utterly inappropriate!
✦ But it’s the same information! So it’s the context that matters.
So what’s different about this?
✦ Some definitions of privacy equate it to secrecy.
Contextual integrity doesn’t!
✦ Information doesn’t have to be totally secret to be private in some way.
✦ Some equate it to compliance with applicable law.
Contextual integrity doesn’t!
✦ An information flow can be legal and still violate contextual integrity.
✦ Some think privacy means individuals’ control over
their data. Contextual integrity doesn’t!
✦ You might not be in control of a given information flow, but that doesn’t
mean it bothers you. (If I discuss your group’s project with another
member of your group, does that violate your privacy? I’m guessing
not… even if you don’t even know it’s happening.)
✦ Or you could have control, make a mistake with it, and still feel your
privacy was violated. (Has an online service ever done you dirty?)
The Five Parameters
✦ Per Nissenbaum, people sensing a privacy violation are reacting to one (or more) of five aspects of a data-sharing transaction:
✦ DATA SUBJECT: who is this data about?
✦ DATA SENDER: who’s sharing the data?
✦ DATA RECIPIENT: who’s receiving the data?
✦ INFORMATION TYPE: (exactly what you think it is)
✦ TRANSMISSION PRINCIPLE (hold that thought; I’ll get to it)
✦ WRITE THESE DOWN, PLEASE.
✦ You’ll want them as you work through your campus data report.
✦ Now I’ll explain them, with examples.
✦ Especially that last one. It’s a bit tricky.
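✦ If it helps to see all five in one place: here’s a minimal sketch (mine, not Nissenbaum’s; all the names and values are made up) of the five parameters as a little Python record, using the earlier Twitter example.

```python
from dataclasses import dataclass

# A study aid, nothing official: the five parameters of a
# data-sharing transaction, bundled into one record.
@dataclass
class InformationFlow:
    data_subject: str            # who the data is about
    data_sender: str             # who's sharing it
    data_recipient: str          # who's receiving it
    information_type: str        # what kind of data it is
    transmission_principle: str  # what limits apply to the flow

# Same information, different context:
okay = InformationFlow(
    "you", "me", "you", "your coursework", "discussed privately"
)
not_okay = InformationFlow(
    "you", "me", "all of public Twitter", "your coursework",
    "blabbed with your name attached"
)
```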
✦ Data that it’s okay to share about one person may
not be at all okay to share about another.
✦ Example: Children. In the US, there are more
legal restrictions on collecting and sharing data
about children than about adults.
✦ The rationale being that children are more vulnerable than adults,
less autonomous, and less able to protect themselves
✦ Example: Folks with limited cognitive capacity
✦ For whatever reason (age, some disabilities, both)
✦ Again, the rationale is that sharing data about vulnerable people who
don’t understand what’s going on well enough to object is Not Okay.
✦ Not everybody’s entitled to share data about you,
even if they got it in a reasonable way.
✦ Example: attorney-client privilege
✦ If you’re paying a lawyer, they need to keep their mouth shut about
what you tell them except for what they must disclose to get the legal
work done that you’re paying them for.
✦ Example: HIPAA, US health-records privacy law
✦ Heavily regulates what HEALTH PROFESSIONALS can share (also when
and under what circumstances)
✦ Doesn’t actually regulate anybody else who gets their paws on your
health data! Got a Fitbit? DUMP IT. Period-tracking app? DELETE IT.
Don’t buy that health-tracking smartwatch, okay? All of these can rat
you out (and actually have ratted people out). HIPAA doesn’t care.
✦ Not everybody is entitled to have your data
shared with them.
✦ Example: FERPA, US educational-record law. It’s
complicated, but for example:
✦ If you’re my advisee, I am allowed to see what courses you’ve taken
and what grades you’ve earned. (I need to know to do my job!) But I
can’t go tell your boss; FERPA will smack me down. Bad recipient!
✦ Not my advisee? Your course choices and grades are none of my
business, unless you actually tell me about them (which is your right)
OR I have a solid, education-related reason I need to know.
✦ (For example: I help assess undergraduates for the Phi Beta Kappa
honor society. I see transcripts, including grades, for students under
consideration. FERPA is fine with this.)
✦ Without such a reason, FERPA does not consider me an acceptable recipient.
✦ Exactly what you think it is. Some data is more
sensitive than other data.
✦ Example: personal identifiers
✦ Such as (US) social-security numbers, credit-card numbers, passport
and other ID numbers, and so on.
✦ They’re sensitive because if they leak, somebody can do you a whole
lot of harm with them.
✦ Example: health data, again
✦ In the first phase of Data Doubles, we found that a lot of our
respondents had trouble coming up with information types that they
considered sensitive enough to restrict campus access to.
✦ There was a notable exception: data about mental-health treatment
or other counseling. “Keep that private!” several said.
✦ Roughly, “what limitations are there on this information flow?”
✦ Some variant on “don’t randomly blab it!” is a
really common transmission principle.
✦ Example: encryption in web browsers
✦ Web browsers are BUILT for information flow! That’s their whole reason for existing!
✦ Data subject is presumptively fine (or, breaches and errors and doxers aside, the info wouldn’t be on the web in the first place), data sender and recipient are fine, information type is fine…
✦ … but the entire world doesn’t need to be able to peek in on what I’m doing. Especially if it involves my credit union or my doctor or my
boss. And early web browsers didn’t have any way to prevent that!
✦ They do now. (Up to a point. We’ll talk about it.)
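✦ If you want to see that encryption in action: here’s a minimal sketch, assuming only Python’s standard ssl and socket modules (example.com is just a stand-in), that opens an encrypted connection the way a browser does.

```python
import socket
import ssl

# Sketch: do what a browser does before it trusts a site. The default
# context verifies the server's certificate; the wrapped socket encrypts
# everything, so onlookers can't peek at what flows through it.
context = ssl.create_default_context()

with socket.create_connection(("example.com", 443)) as sock:
    with context.wrap_socket(sock, server_hostname="example.com") as tls:
        print(tls.version())                 # e.g. 'TLSv1.3'
        print(tls.getpeercert()["subject"])  # who the certificate says this is
```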
So, how do I use this?
✦ Great question. Two ways, I think.
✦ One: when you learn about something that you
consider a privacy violation, use the Five
Parameters to dig into what feels wrong about it.
✦ Would you be okay if it was a different information type? If they asked you first (transmission principle)? If it wasn’t going to That Person or That Company or Those Cops or That University Office?
✦ Two: in your campus-data report, evaluate what you find out about information flows against the Five Parameters. If something’s not right, explain precisely why not! (One way to be precise about that is sketched below.)
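✦ A sketch of that, with entirely made-up values: write down the flow you expected and the flow you actually found, then see which of the five parameters differ.

```python
# Sketch with made-up values: compare the information flow you expected
# against the one you actually found, parameter by parameter.
EXPECTED = {
    "data_subject": "students",
    "data_sender": "campus wifi",
    "data_recipient": "campus IT",
    "information_type": "network logs",
    "transmission_principle": "kept internal, deleted promptly",
}
OBSERVED = dict(EXPECTED, data_recipient="an off-campus analytics vendor")

for parameter, expected_value in EXPECTED.items():
    if OBSERVED[parameter] != expected_value:
        print(f"Not right: {parameter} is {OBSERVED[parameter]!r}, "
              f"expected {expected_value!r}")
```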
“Tyranny of the normal”
✦ What if social norms for a particular information flow are broadly accepted, but still kind of bad?
✦ An example to think about: video surveillance of public spaces, especially
when combined with facial-recognition technology.
✦ Strict contextual-integrity theory says “meh, it’s the
norm, let it go.” Not an ideal answer!
✦ This totally happens. Adtech is a great example! Repeated studies now indicate that once people understand how adtech tracking works and where the data goes, they HATE it. But it’s still considered normal!
✦ Nissenbaum: take a step back and consider the
reasons this information flow exists, and the ethics
binding the collecting/sharing parties.
✦ MA/LIS folks: this is where our ethics codes around privacy, equitable
service, and putting patrons over vendors kick in.
Cool? Cool. Thanks!
This presentation is copyright 2021 by Dorothea Salo.
It is available under a Creative Commons Attribution 4.0 International license.