
How do we not be Facebook? (With notes.)

Dorothea Salo
September 17, 2021

Given for a speaker series on data and privacy at UW-Superior.

Transcript

  1. How do we not be FACEBOOK?!
    Dorothea Salo
    Distinguished Faculty Associate, UW-Madison iSchool
    [email protected]
    The Data Deluge: Privacy in a Connected World, UW-Superior, September 2021
    Hi, folks, and my thanks to the series organizers for inviting me. Also huge thanks to Julie and Stephanie for making all the arrangements, I really appreciate it. I’m very
    honored to have the chance to be here and talk to you folks today!

    A quick content alert, before I get started. I am likely to mention some human-rights atrocities that Facebook enabled—VERY briefly, not graphically, certainly no images,
    but even so, I don’t want that to be a sudden and terrible shock for anyone. If you need to leave this talk because of that, I completely understand.


  2. And I should also mention that I am a co-investigator in the Data Doubles research project, which is investigating student attitudes toward privacy with respect to learning
    analytics and particularly library involvement in learning analytics. If you’re interested, you can find out more at datadoubles dot org.

    This is not a Data Doubles presentation, however. This is just me, and nobody but me is responsible for what I say today.


  3. “Privacy is dead,” they say.
    So what?
    Image: Sean MacEntee, “privacy”


    https://www.flickr.com/photos/smemon/4592915995/


    CC-BY
    The saying “privacy is dead” has been attributed to Mark Zuckerberg, but I can’t find any credible evidence he actually said it, so I’m not going to say he did. I think it’s
    fair to say, however, that he and the company he founded have certainly acted on this idea, including trying to make it true.

    And I hope I don’t have to explain to the librarians here today why privacy is important, but I also know that not all of you are librarians, or coming to this with the
    importance of privacy baked into your bones the way I have.

    So really briefly, I don’t want to spend a lot of time on this… there are lots of ways to think about and research the importance of privacy to people, and there are a
    couple-three basic things we’ve learned when we’ve done that. One is that privacy is necessary for us to learn and make progress, as individuals and as societies. As
    individuals, it’s hard to feel safe enough to make mistakes, which is CRUCIAL to learning, if there’s always somebody looking over our shoulder and maybe judging us. As
    societies, it’s hard to fix oppression, including of us by our governments, if there’s no place people can organize and protect one another unobserved.

    Another thing is that privacy is a powerful harm-reducer. Think about the person who dislikes you most in the world, just really cannot stand you. Now imagine that
    person, whoever they are, knowing practically everything there is to know about how you live your life. Does that feel like a safe situation to you? It doesn’t to me. Privacy
    is one defense you have against that person, against all the people or groups who totally wouldn’t mind seeing harm come to us.

    As I tell my students, “information about people equals power over them.”


  4. SURVEILLANCE
    I’m going to be tossing around this word a lot, “surveillance,” so I want to define it. Fortunately, it’s massively easier to define than privacy is! It’s from Latin through
    French, actually, and the roots just mean “overwatching” or “overseeing.” So surveillance is just systematic observation — or, if you want to put it this way and I do,
    systematic privacy curtailment. That’s all it is.

    Okay? Okay, let’s move on.


  5. As you heard, I’m a librarian and an educator, I teach in the Information School down in Madison.

    Couple-three years back I built a course that’s now called Information Security and Privacy, though that name is about to change AGAIN because I’m working with Dr.
    Rahul Chatterjee in Computer Science to harmonize his infosec course with this one — he gets most of the techie stuff, I get most of the human-factors and
    organizational-behavior stuff. He talks about network engineering, I talk about social engineering! It’s gonna be great, I’m really excited.


  6. “So, what’s new in privacy and security this week?”
    Last time I got to teach this course in person — obviously it’s been a while — I kicked each class off with this question: So, what’s new in privacy and security this week?


  7. “FACEBOOK’s in trouble again.
    What is it THIS time?”
    And Facebook was just a gift to me teaching that class! Because I could always follow up whatever news stories students brought in with “Facebook’s in trouble again. What is it THIS time?” There’s. Always.
    Something.

    So, let me toss this out there to the room. Y’all follow the news, what-all IS Facebook in trouble for? Legal trouble, reputational trouble, whatever, it all counts.

    (convo starters if needed:)

    - Rohingya genocide, possibly repeating itself in Tigray, Ethiopia

    - mob attacks/murders in South Asia

    - disinfo dissemination: by Russia in the 2016 election, by Trump in 2020, about COVID

    - passwords in plain text; third-party data consumers leaving data on open Amazon servers

    - voter suppression by ad targeting to African-Americans

    - FTC consent decree violation

    - emotional contagion “study,” no notice or consent

    - redlining housing and job ads (even when ad purchasers try not to)

    - Cambridge Analytica: handing over data w/o user notification or consent

    - ad targeting to e.g. anti-Semites and racists

    - shadow profiles of non-users


  8. Just limiting it to privacy stuff — which is far from the only thing Facebook gets in trouble for, of course — we could be here all day. And they’re even starting to see
    consequences, though it’s kind of a nibbled-to-death-by-ducks situation. Still, six million here, half-billion-with-a-b there (that being the total Facebook is looking at in
    Illinois), eventually it adds up to real money, even for Facebook.

    And imagine what that kind of money loss would mean to higher ed. Or whatever industry y’all work in.


  9. So I think it’s safe to say that none of us wants to be in Facebook’s shoes right now, or possibly indeed ever. Facebook’s shoes are not good shoes! They are bad, bad
    shoes, y’all!

    Facebook has completely and utterly SQUANDERED all the goodwill and trust it ever had, and deservedly so. They are DEEP in legal trouble, it’s only gonna get worse
    for them, and deservedly so. Does anybody trust Facebook-the-company now? Practically nobody trusts Facebook.


  10. Image: Renaud Camus,


    “Le Jour ni l’Heure 8497 : Paulus Potter, 1625-1654,


    Le Taureau, 1647, dét., le bouc, La Haye, Mauritshuis,


    dimanche 21 janvier 2018, 16:53:27.”


    https://www.flickr.com/photos/renaud-camus/25224890687/


    CC-BY, cropped
    But look. It’s super-easy to make Facebook a scapegoat and feel all superior to Zuckerberg and Sandberg and their gang of dupes and quislings. I mean, let’s not
    pretend I’m above cheap shots, I picked Facebook to pick on today because it’s just so easy!

    So yeah, it’s SUPER-easy to say “well, we’re not THEM, so what WE do with people’s data must be okay.”


  11. In my own industry, higher education, that goes something like “We’re not The Zuck, we’re not The Sandberg, we’re higher ed, we’re the good people wearing the good
    shoes! We’re here to help, we’d NEVER do any of the horrible things The Zuck and The Sandberg have been up to all this time!”

    I mean, I’m a librarian, right? We librarians are practically INDOCTRINATED into believing we’re the good people wearing the good shoes no matter what—thanks to
    Fobazi Ettarh, there’s actually a phrase for this belief in librarianship now, it’s “vocational awe.”


  12. Photo by (and courtesy of)


    Dr. Mar Hicks. Cropped.
    But is higher ed beyond reproach? Really? Look at this flyer that was plastered all over the School of Engineering in Madison a couple-three years ago. Wanna be the
    next Zuckerberg? it proclaims. Sure, I’m totally down for destroying civil societies all over the world, getting millions of people killed via mobbing and disinformation, and
    bringing Jeremy Bentham’s wildest panopticonic dreams to life! Sounds great!

    Now, did the Wisconsin Institute for Discovery really mean harm by this? I think not. Even so, did they cause harm? Yeah. Legitimizing Facebook causes harm.

    So what I want to happen today, is each and every one of us holding in our souls, sitting with the truth that SOMETIMES WE ARE NOT THE GOOD PEOPLE. I’m not,
    you’re not, we’re not. The shoes we wear are sometimes bad, folks! (Though, I’m saying, seriously y’all, my shoes are Very Good today, I knew I had to bring my best
    shoe game to this talk.)

    So what I’m doing today is using Facebook’s many and awful privacy screwups as examples of how we here in higher ed — really any of us anywhere, I know not
    everybody here works here — how we can also screw up. And then we can, I hope, avoid it. Because wow, we SO DO NOT want to step into Zuckerberg’s very bad
    shoes. Make sense? Okay.


  13. ENTITLEMENT
    Maybe the biggest thing Facebook did wrong that they’re in major trouble for now was feeling a total entitlement to any data they could grab from anyone they could grab
    it on or from. The Zuck and The Sandberg never even ASKED themselves if maybe, just maybe, they were not entitled to know. “We CAN collect these data, therefore it
    must be okay to!” they said.

    “Nobody even knows we’re collecting these data, so who’s to object?” says the Facebook terms-of-service agreement, merely by being fifty gazillion pages of dense
    legalese.

    “It’s our business model, so that automatically makes it okay!” Gotta pay back those venture capitalists, that’s clearly the most important thing ever. “We only collect user
    data to improve our service!” Like, did these people EVER believe the stuff coming out of their mouths?


  14. It so happens that I deleted Facebook before deleting Facebook was cool. No, I really did, it was so long ago I can’t even remember exactly when I did it, but it was
    around the time the Facebook Beacon thing broke. I was like, it’s none of Facebook’s business what I buy, I’m done with Facebook.

    Yet you know something? Facebook still knows stuff about me. It compiles what are called “shadow profiles” on non-users, through web bugs and buying data off data
    brokers. Soooooo… I left Facebook to stop them messing with my privacy, and they still think they’re entitled to do it? And to use that data however they feel like? And I
    can’t do anything about it because I’m not a Facebook user? What even IS that, other than Facebook seriously, seriously overstepping?
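
    Since I just mentioned web bugs, here’s roughly how small the mechanism is. This is a minimal, hypothetical sketch in Python (the server address, cookie value, and port are all made up for illustration): a tracker serves a one-pixel image that any page anywhere can embed, and every time anybody loads such a page, the tracker learns who they are and what they were reading.

```python
# Minimal sketch of a "web bug": a 1x1 tracking pixel whose real payload
# is the request itself. Hypothetical and simplified; real trackers are
# far more elaborate.
from http.server import BaseHTTPRequestHandler, HTTPServer

# A complete 1x1 transparent GIF, 43 bytes.
PIXEL = (b"GIF89a\x01\x00\x01\x00\x80\x00\x00\x00\x00\x00\xff\xff\xff!"
         b"\xf9\x04\x01\x00\x00\x00\x00,\x00\x00\x00\x00\x01\x00\x01\x00"
         b"\x00\x02\x02D\x01\x00;")

class PixelHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # The embedding page (Referer), the visitor's IP, and any cookie
        # set earlier: enough, over time, to build a browsing profile of
        # someone who never once visited the tracker's own site.
        print("hit from", self.client_address[0],
              "| page:", self.headers.get("Referer"),
              "| cookie:", self.headers.get("Cookie"))
        self.send_response(200)
        self.send_header("Content-Type", "image/gif")
        # Re-issue a long-lived identifier so future hits link to this one.
        self.send_header("Set-Cookie", "uid=abc123; Max-Age=31536000")
        self.end_headers()
        self.wfile.write(PIXEL)

# Any page can embed <img src="http://tracker.example/pixel.gif"> and
# every pageview then pings this server.
HTTPServer(("", 8000), PixelHandler).serve_forever()
```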


  15. But Facebook still feels entitled to track me, no matter what I want, so Facebook tosses out this incredible constant dumpster fire of bafflegab self-justification. And what
    little tech-press reporting comes from inside Facebook indicates that it’s semi-cultish all up in there, if you critique anything at Facebook in even the tiniest way you’re out
    the dang door.

    But does that self-justifying, critique-not-allowed atmosphere sound familiar to any of y’all? Have you heard similar streams of empty rationalizations from people in Big
    Data spaces? Because I kinda have, including in higher ed, and even inside my own profession of librarianship, which is breaking my heart, this is not supposed to be
    what we’re about! And I just, I NEED all of us to be constantly questioning our entitlement to collect, store, analyze, share, and sell data about people.


  16. Let’s talk admissions for a moment. This is high-stakes stuff in higher ed for a lot of reasons I don’t have time to go into. For now, let’s just focus on colleges and
    universities using web surveillance to gather data on applicants and potential applicants. They’re using pretty much the same privacy-invading techniques as Facebook,
    y’all! And buying data off data brokers to boot. If we’re the good people with the good shoes who think privacy is kind of cool — we maybe shouldn’t be doing this.

    And it gets worse. If you read these and other discussions of Big Data and surveillance in admissions, you find out that if someone is actively trying not to be tracked, like
    an applicant has Privacy Badger or uBlock Origin installed in their browser, or they’re on an iPhone, the school assumes that applicant is not interested! It doesn’t point
    financial aid at them, it doesn’t court them in any way at all. And let’s get this straight: that is real harm to them, caused just because they want to explore on their own
    without being surveilled. I’m sorry, but explain this to me, how in the WORLD is penalizing people for protecting their own privacy okay?
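
    To make concrete how blunt this logic is, here’s a hypothetical sketch (all names and data invented, not any real vendor’s code) of the kind of “engagement scoring” these systems boil down to. An applicant running uBlock Origin generates no tracking events at all, so they score zero and quietly fall out of every outreach and aid list:

```python
# Hypothetical sketch of the harmful logic described above: an "interest
# score" computed from tracking-beacon hits. Applicants who block trackers
# generate no events, so they score zero -- penalized for privacy.
beacon_events = {  # applicant -> pages the tracking beacon saw them view
    "applicant_1": ["admissions", "financial-aid", "majors/cs"],
    "applicant_2": ["admissions"],
    # applicant_3 read every page -- with uBlock Origin on. Zero events.
}

for applicant in ["applicant_1", "applicant_2", "applicant_3"]:
    score = len(beacon_events.get(applicant, []))
    if score == 0:
        print(applicant, "-> score 0: no outreach, no aid offer")  # the harm
    else:
        print(applicant, f"-> score {score}: recruit, target financial aid")
```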


  17. Self-questioning
    Self-questioning. Questioning our own entitlement to data. That’s how we wear the good shoes.

    Sometimes we are NOT ENTITLED to do what we’re thinking about doing, or to know what we want to know. Sometimes stuff is just plain none of our dang business. I
    hope that’s obvious, but… it doesn’t seem to be, always.


  18. SURVEILLANCE
    CREEP
    Unexamined entitlement to collect, analyze, and share data is part of how we get what social scientists and historians call surveillance creep—the reuse and
    augmentation of existing data for new and often nefarious purposes that weren’t originally planned for or even imagined.

    And social scientists say ruefully that surveillance creep is hard, if not impossible, to stop. In other words, you MUST expect any data you collect and store, or that is
    collected or stored ABOUT you, to be used for purposes you didn’t intend—and quite likely wouldn’t approve of.


  19. After all, this is kind of how we got to Cambridge Analytica, right? Supposedly they and Facebook were collecting silly quiz data because silly quiz data, or maybe
    because advertising.

    Ha ha, joke’s on us, Cambridge Analytica was trying to use the data to throw elections! Whether they were successful or not, and that one’s debatable, just the idea that
    using people’s data to manipulate them at scale is not only possible, but in fact COMMON—this article here is about marketers doing it—this should maybe give us
    pause about collecting and using data.


  20. But we’re higher ed, we’ve got the good shoes on, we’d NEVER — yeah, okay, I don’t think anybody’s surprised by this one. Cameras cameras everywhere and no
    apparent limitations to what the footage gets used for.

    Here’s a story I can tell on Madison — our department library put in electronic swipe-card access so folks in the department can get in after hours. Being the obnoxious
    Elephant’s Child I am, I asked our librarian where the data goes. Anybody wanna guess? Throw me a guess.

    Yeah, swipecard data from every swipe-carded space on campus goes straight to campus police. What do they use it for? They… don’t really say much about that.


  21. Data minimization
    And what I’m saying is, if we’re serious about wearing the good shoes, the expectation of surveillance creep needs to condition how much data we collect and keep, and
    how long we keep it. In privacy circles that’s called “data minimization,” and it ought to be central to how we deal with data. Data we don’t have can’t be breached or
    leaked or abused — including by us.
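
    What does data minimization look like in practice? Here’s a small, hypothetical sketch (the field names and the 30-day window are just illustrative, not a recommendation): coarsen identifying details before they’re ever stored, and purge on a short, deliberate retention schedule.

```python
# Hypothetical sketch of two data-minimization moves: collect less up front
# (coarsen IP addresses before logging) and retain less (purge records past
# an explicit retention window). Details are illustrative, not prescriptive.
from datetime import datetime, timedelta

RETENTION = timedelta(days=30)  # a policy choice, written down and enforced

def minimized_entry(ip: str, path: str) -> dict:
    # Zero the last octet so the log can't pinpoint an individual machine.
    coarse_ip = ".".join(ip.split(".")[:3]) + ".0"
    return {"ip": coarse_ip, "path": path, "ts": datetime.utcnow()}

def purge_expired(log: list) -> list:
    cutoff = datetime.utcnow() - RETENTION
    return [e for e in log if e["ts"] >= cutoff]

log = [minimized_entry("198.51.100.37", "/catalog/search")]
log = purge_expired(log)  # run on a schedule: data we don't keep can't leak
print(log)
```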


  22. Or we risk the people we serve expressing very strong opinions about the badness of our shoes, as happened at UCLA last year.


  23. CARELESS SHARING, INSECURITY
    Another bad thing that the entitlement fuels is sheer carelessness with data. Sure, let’s collect every piece of data we can imagine, throw it all in a single big bucket that’s
    a Big Red Target for every hacker there is, and spray data indiscriminately at so-called partners, what could possibly go wrong?


  24. And we’d be here all day if I seriously got into talking about the way Facebook sprays data carelessly everywhere and has gotten caught with absolutely laughable
    security practices. So I’ll just mention one outrage of many—lots of Facebook’s developer partners are as trustworthy as the average weasel—and now I’ll move on.


  25. “But we took out the personally-identifiable information!” says every sketchy surveillance outfit everywhere. Look. Y’all. The more data that’s collected on people, the less
    meaning P-I-I even has. With enough information about us, we’re all identifiable, it doesn’t even matter if our names and social-security numbers get taken out. And in
    Facebook’s world of inescapable surveillance, that information absolutely exists.

    And I want y’all to notice that I am intentionally combining Facebook with academic research in these headlines here. Researchers have Facebook data! And we know
    that people can be picked out of the social media data researchers have! The good shoes, academia is not always wearing them.

    So careless sharing and imperfect security—and no security is perfect, that’s Information Security 101—these are real dangers to all of us.
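
    If you want to see how little “taking out the PII” buys, here’s a toy sketch with entirely invented data. Names are gone, but the quasi-identifiers that survive (ZIP code, birth year, gender) line up against any public roster; this is essentially Latanya Sweeney’s classic linkage attack:

```python
# Toy re-identification by linkage, on invented data. The "de-identified"
# records have no names, but joining the surviving quasi-identifiers
# against a public roster restores them anyway.
deidentified = [
    {"zip": "54880", "birth_year": 1999, "gender": "F", "gpa": 2.1},
    {"zip": "53706", "birth_year": 2000, "gender": "M", "gpa": 3.7},
]

public_roster = [  # e.g. a campus directory or a LinkedIn scrape
    {"name": "A. Student", "zip": "54880", "birth_year": 1999, "gender": "F"},
    {"name": "B. Student", "zip": "53706", "birth_year": 2000, "gender": "M"},
]

KEYS = ("zip", "birth_year", "gender")
for record in deidentified:
    matches = [p["name"] for p in public_roster
               if all(p[k] == record[k] for k in KEYS)]
    if len(matches) == 1:  # a unique match re-identifies the record
        print(matches[0], "has GPA", record["gpa"])
```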


  26. Anybody here from Minnesota today? (IF YES: Oh, you are? *FLINCH EXAGGERATEDLY AWAY*) (IF NO: “Oh good. I’m relieved actually.”)

    I am not welcome in Minnesota library circles. PERSONA NON GRATA, that’s me. And that’s because I gave a conference keynote a few years back where I was pretty
    harsh on poor security and privacy practices I found among Minnesota libraries. Y’all, I don’t do this public-speaking thing for popularity, okay?

    And one of the practices I called out was this study at the top from the University of Minnesota that just mashed up tons and tons of fully-identified data about students,
    without telling them, without asking their permission… and the way the data are presented in this article and others that have been published out of the data, I’m telling
    you, I could spend fifteen minutes on LinkedIn and reidentify some individual students in the study. And I might then be able to learn things about their college career out
    of the published articles that I’m guessing they don’t want me knowing.

    That’s just carelessness on the part of those University of Minnesota authors. And it’s not okay at any level. I don’t know who should have caught this, the IRB or the
    University Librarian or the journal editors or reviewers or whoever, but SOMEBODY should have.

    And this isn’t even uncommon, my fellow librarians, don’t any of us be feeling smug about our shoes here! My research colleague Kristin Briney did a review of data
    management practices in library studies like this, and it’s a DISASTER, people. Carelessness is RAMPANT.


  27. Responsible data governance
    If we want to be good people wearing good shoes, we have to govern our collection and use of data better than this. We have to.


  28. SECRECY AND LIES
    Secrecy and lies. Wow, y’all. The Zuck and The Sandberg will do just about ANYTHING to keep pretending they’re entitled to spy on us, so as a matter of course they
    keep secrets and tell lies about what they’re actually doing. As transparent as lead.


  29. Gizmodo, 26 September 2018


    https://gizmodo.com/facebook-is-giving-advertisers-access-to-your-shadow-co-1828476051


    Fair use asserted
    Kashmir Hill, who is a terrific journalist on the privacy and security beat, asked Facebook if they were letting advertisers get at, like, cell phone numbers and email
    addresses of people who didn’t even give Facebook that information, much less permission to use it. And Facebook said “nope, not us, we’d never”—and then Hill and
    some researchers proved it, whereupon Facebook finally, grudgingly admitted it.

    Let me say this again: FACEBOOK LIED TO A JOURNALIST ABOUT USER PRIVACY. And that was AFTER trying to keep the whole mess secret. And if you follow news
    coverage of Facebook at all, you know that this is a PATTERN with them!


  30. These are just two recent stories about Facebook getting caught in straight-up whoppers about how its rules are enforced and how it allows independent researchers to
    study the effects it has on people and societies.


  31. But we’re higher ed and we wouldn’t keep secrets or tell lies, right? A couple of years ago, this story popped up out of British Columbia. A student at UBC, which uses
    the learning-management system Canvas that we also use down in Madison, this student made a sunshine-law request to know what data Canvas had on him. And the
    school’s first decision was to yell NO, YOU CAN’T KNOW THAT.

    Y’all. Really? How very Facebook of them. It’s a perfectly reasonable question. In the spirit of respecting student inquiry if nothing else, UBC shouldn’t have needed a
    sunshine-law nudge. They should have been forthcoming in the first place!


  32. So, this is about sixteen long stories, but the short version is, I obtained my own library circulation records — the stuff I checked out, the journal articles I looked at online
    — I got the list from the University of Wisconsin with a sunshine-law request. Turns out that the U-Dub knows what I’ve been checking out for nearly the last TWENTY
    YEARS.

    How did I even know to put in a public-records request? Students in the infosec-and-privacy course I teach dug up the records schedule for this class of data, which I
    admit I had never seen or thought to look up, and it completely freaked me out.

    So… anybody here who IS NOT a records manager read University of Wisconsin records schedules for fun? Would you even know where to FIND one?

    Yeah. In effect, the University of Wisconsin Libraries kept this secret from me. And I’m a little upset about that, you may have guessed. I do not think this is okay. Libraries
    are not supposed to act like Facebook!


  33. Transparency
    Heeding our inner concerns
    Keeping secrets and telling lies is not how The Good People Wearing The Good Shoes act. These people value transparency.

    And I want us to pay really close attention to any feelings we have of “oh gosh, we can’t tell the people we’re collecting data on, they’d be furious!” or “oh gosh, we can’t
    tell them, they Just Wouldn’t Understand!” Because those feelings, those inner concerns, are important. That’s our backbrain telling us that we might be doing something
    bad and then trying to hide that from the people we’re doing the bad thing to.


  34. MANUFACTURING
    CONSENT
    And then there’s the question of not just whether we were informed about all this surveillance, but whether we consented to it. And so that you know, law scholars and
    information scientists pretty much say that the usual clickthrough agreements used to get something vaguely resembling consent are broken. They don’t meaningfully
    inform anybody, nobody reads them because who has time and they’re written for lawyers, not us, and, just—let’s not pretend, those things are not consent in any
    meaningful way.

    So it’s not okay to hide behind them, to think that any old kind of data collection or analysis or sharing is okay because the data subjects supposedly consented with a
    click or two, or even a signature. They didn’t. Not really.


  35. Facebook, of course, takes lack of meaningful consent to the usual extremes. There was the “emotional contagion” study some years ago, where Facebook manipulated
    people’s newsfeeds to see if it could upset them. Consent, what consent? More recently, Facebook got caught paying teenagers—an economically and intellectually
    vulnerable population—to install spyware on their phones. Ain’t that great.


  36. So yeah, my colleagues in higher ed who are into good shoes, let’s maybe remember that a lot of our students are teenagers too. Not to mention economically and
    intellectually vulnerable. If it ain’t okay for Facebook to manufacture their consent, it ain’t okay for us either.

    Going back to UBC, the consent process for Canvas data collection amounted to a legalese waiver signed when students got their campus ID. Like, in what universe can
    somebody realistically say “no, I do not consent” at that point? This is forced consent, which isn’t consent at all. It’s also not informed consent, as the student, Bryan
    Short, notes at the top here.

    And then Bryan tried to opt out of using Canvas at all to keep it from tracking him, and y’all can probably guess how THAT went, right? There just wasn’t a reasonable,
    usable alternative for him. So why even bother asking him to consent, when he can’t realistically say no?


  37. Forced consent IS NOT GENUINE CONSENT.
    Consent by not clearly informing
    Consent with no real alternative
    Consent-by-legalese
    So what we see a lot at Facebook, even in higher ed — honestly, almost everywhere these days! — is ramming down people’s throats something that looks like consent
    but actually isn’t.

    Now, Facebook’s gonna Facebook, but here in higher ed, we’re a trust enterprise. Students have to trust us or we can’t do our jobs. We actually need genuine consent.

    *read slide*


  38. Refusable, revocable, genuine consent
    The good people wearing the good shoes seek genuine consent to data collection, use, and disclosure, and they make sure that people can say “no” whenever they
    need to and for any reason, or indeed for no reason at all.


  39. EXPLOITING POWER ASYMMETRIES
    Another way to think about consent manufacturing is as exploiting asymmetries of power. Those of y’all who have been to college, think back to how YOU felt as a
    brand-new undergrad. Overwhelmed, small, and scared, if you were anything like me. So somebody from the institution, automatically an authority figure, this person
    comes up to you waving fifty pages of legalese that amount to “we wanna watch you like a bug under a microscope, is that okay?” Of course you’re not gonna say no,
    much less “ew, gross, stay out of my life!” You’re feeling overwhelmed, small, and scared and here’s this person with power who wants something from you!

    It’s not okay to do this and call it consent. It’s legal, yeah, but it’s not okay! It’s a total trust-destroyer!


  40. This story, it cracked me up, the original headline was “Facebook’s Privacy Policies are an Unreadable Mess.” So yeah. Bad shoes here, Facebook using its considerable
    power to exploit us and steal our privacy out from under us.


  41. Back to my circulation records, okay? Here’s where power comes into it. The University of Wisconsin Libraries have no privacy policy. None. Nowhere except for those
    buried records schedules — nowhere do they disclose they’re doing this. Nor is there any way at all for me to tell them to butt out of my reading habits. That’s such a
    power trip I can’t even TELL you.

    And I’m an employee and a librarian myself… not a student. It’s even harder for students to try to stand up to power here.


  42. Policy… with teeth
    Limit power trips
    The good people wearing the good shoes not only have privacy policies, they enforce them on themselves. And they don’t power-trip on everybody in sight. “Because
    we can!” is not an ethical justification for anything EVER.


  43. REINFORCING BIAS AND OPPRESSION
    I’ve talked mostly about surveillance so far, but I do also want to talk about one side effect of analyzing the data gained through surveillance: reinforcement of existing
    bias and oppression.


  44. Some researchers found out that even if Facebook advertisers WANT to cast a wide, unbiased net for who sees their ad, Facebook’s recommender system discriminates
    by race and gender anyway. Facebook’s ad recommender cannot figure out how NOT to be racist and sexist.

    But for once, this is not an outlier result, this is not Facebook going above and beyond to be horrible. Facebook’s right in line with other Big Data and machine learning
    and artificial-intelligence and predictive analytics projects here.

    One reason this happens is that because we live in a biased society, Big Data from and about us intrinsically carries bias. Biased datasets are used to train recommender
    systems, which produce, surprise! biased results. Over and over and OVER AGAIN we have seen this with search engines and recommender systems and predictive
    analytics and other surveillance-fueled tools, yet STILL we implement and plan to rely on them.

    There’s a saying about doing the same thing over and over again while expecting different results…?


  45. One place that recommender systems are coming up in higher ed is with advising. Let’s let recommender systems route students to courses, let’s let them route students to majors. And we can do it better if we throw lots of data
    about students, including surveillance data, at this problem, right? Right?

    Yeah, no. I teach this other course called Code and Power that discusses the demographics of IT industries and IT education. I know EXACTLY what’ll happen if course
    recommender systems use demographic data to point students at courses and majors. Even fewer people who aren’t white or Asian men will be pointed to most science
    and engineering courses and majors, that’s what’ll happen, because that’s what we have NOW and the recommender system will see that pattern and reinforce it. That’s
    what recommender systems are designed to do! See patterns and reinforce them! They don’t know the difference between a good pattern and a pattern of bias!

    And please don’t tell me that you’ll just leave out demographic info and it’ll be fine. It will not be fine, because patterns of racism and sexism in science and engineering
    are easy for algorithms to spot even if you take the specific race and gender variables out. Key phrase is “proxy variable,” look it up, and understand that the data these
    advising systems rely on will be full of proxy variables for gender and race.

    Look, computers do not have ethics. They don’t have cultural competency. They don’t even have the consciousness of human society and its issues that would enable them to develop ethics and cultural competency. This makes it a
    pretty bad idea to rely on them in situations that require ethics and cultural competency.

    Such as advising.
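
    Here’s a tiny simulation of what a proxy variable does, on synthetic data I made up purely for illustration. Gender never appears in the features the recommender sees; the skew comes through anyway, because in the biased “history,” the proxy (having taken AP CS) already encodes it:

```python
# Hypothetical sketch of a proxy variable at work, on synthetic data.
# Gender is deliberately withheld from the recommender, but a surviving
# feature (took AP CS) encodes it, so the skew is reproduced anyway.
import random
random.seed(1)

def make_student():
    gender = random.choice(["M", "F"])
    # The bias already in the world: in this invented history, men were
    # far more likely to have been steered into AP CS to begin with.
    took_ap_cs = random.random() < (0.6 if gender == "M" else 0.1)
    return {"gender": gender, "took_ap_cs": took_ap_cs}

students = [make_student() for _ in range(10_000)]

def recommend_cs(student):
    # The "fair" recommender: it never sees gender, only the proxy.
    return student["took_ap_cs"]

for g in ("M", "F"):
    group = [s for s in students if s["gender"] == g]
    rate = sum(recommend_cs(s) for s in group) / len(group)
    print(f"{g}: CS recommended to {rate:.0%}")  # sees pattern, reinforces it
```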


  46. Never try to make data
    do a human’s job.
    The good people wearing the good shoes never try to make data do a human being’s job. And care, compassion, and nuance are human jobs. Machines can’t do them.


  47. INDIFFERENCE TO HARM
    The utter indifference within Facebook, by Facebook, to the incredible amount of harm that Facebook has done, to individual people and to the world, I can’t get over
    this. It’s so stark, and so awful.


  48. SERIOUSLY, FACEBOOK?!
    Facebook didn’t care that it was letting antisemitic and white-supremacist organizations target ads to more of the same. They DID NOT CARE that their platforms were
    being used to incite people to attack and murder other people, from individual murders all the way on up to actual genocide. ACTUAL. GENOCIDE. They don’t care now
    that the COVID misinformation they’re letting people peddle is helping people die. They don’t care!

    That sweet, sweet ad money their surveillance and laissez-faire gets them, that’s all Facebook cared about. Until it became an image problem for them. And even now,
    they care about their image, not the harm. If you watch what they say as opposed to what they actually do, they’re trying to deflect responsibility, not cut down on the
    horrific crimes they’ve enabled.


  49. This came out just the other day, Facebook knows that Instagram worsens body-image issues among a huge number of its teenage-girl users. And as a former teenage
    girl and present-day fat woman, let me just say, this isn’t harmless stuff, eating disorders are real and they kill people. But of course Facebook didn’t disclose this until
    forced to, and what are they actually gonna do about it? You tell me.


  50. But we’re higher ed, right? The Good People Wearing The Good Shoes? How bad could it really get here?

    Well, let me toss that out there — any surveillance or data practices y’all have seen over the last couple years that you wish we’d all rethink?


  51. Here’s one that’s on my bad-shoes list: automated exam proctoring. There is nothing defensible about this. It discriminates against folks with various kinds of disability. It
    discriminates against darker-skinned folks. It’s wildly, massively invasive. It creates stress to the point of actual trauma.

    I just. Why. Why did any of us anywhere think this level of privacy invasion was okay. I hope every single lawsuit about exam proctoring goes against higher ed. We
    deserve to have our shoes shredded over this.


  52. CREDULITY AND “NEUTRALITY”
    Which leads me to one last lick I want to get in before I suggest a more holistic way of thinking about how to fix all this.

    Why did we think automated exam proctoring would work? Why do we think automated advising works? Why do we think surveillance-compiled data tells us anything
    we couldn’t find out by, I don’t know, talking to people? Why do we think machines are the Delphic Oracle, able to predict anything and everything accurately? Why?

    And why do we think the use of machines is somehow a Star Trek Neutral Zone, free of questions of ethics and values and morals? Follow somebody on the street with
    your human body, get arrested for stalking — follow them through their phone and you’re a venture-capitalist unicorn! Why, people? Why?


  53. There’s been a hilarious undercurrent in Facebook trade press coverage the last couple years, folks saying louder than a whisper that Facebook’s much-hyped ad
    targeting, aside from its vile side effects and externalities, isn’t actually very good, or indeed good at all. Advertisers can get similar results from ads that aren’t targeted,
    they’re saying.

    And I’m like, why don’t we actually know this? Why hasn’t it been tested? And of course the obvious answer is Facebook doesn’t want anybody looking too closely at its
    shoes, which we know are bad shoes — but given those incentives, why does anybody believe Facebook’s hype? Why?


  54. So, how are our shoes looking here in higher ed? We’re a research enterprise, we’re seekers of truth, do WE believe all the analytics and surveillance hype? If we do, why
    do we?

    Let me be clear, our belief status has NOTHING to do with rigorous research into and testing of all this muck on our shoes. There mostly hasn’t been rigorous research,
    and what little actually has trickled out of the hype-riddled AI launches is mostly a bust.

    This stuff doesn’t work. IT. DOES. NOT. WORK. So let me get this straight, we’re using our students as guinea pigs for venture capitalists, invading their lives to do it,
    without asking them of course… and after all that, pretty much none of it works?

    Wow, everybody. Just wow.


  55. Insist on independent research
    Be skeptical of hype
    The good people wearing the good shoes insist on independent, rigorous, well-conducted efficacy research for any technology they implement, especially when it’s
    potentially or actually invasive of privacy. The good people wearing the good shoes are skeptical of hype, including every single word ever published in Educause Review.
    Again, I feel that this should be obvious, yet it seems not to be.


  56. Photo: Zeev Barkan, “Fine art vs. documentary photographs”


    https://www.flickr.com/photos/zeevveez/7095563439/


    CC-BY, cropped
    So. Ending the rant now. If you’re relieved… honestly, so am I. What we’re left with is, what the heck do we DO? How do we demonstrate—not just say, but
    DEMONSTRATE—that we care more than Facebook does about the harms that surveillance can do? How should we bake our care into our policies? our procedures?
    our communication?

    Because just saying “We take your privacy and security very seriously” doesn’t cut it any more, if it ever did. How many times has Facebook said that? Who believes it
    any more? From anyone?

    I hate to say it, but there isn’t as much good guidance out there as I wish there were. A lot of our existing ethics infrastructure, like IRBs, isn’t really set up to handle this
    new reality either. I mean, find me and ask me, I teach this stuff, I can often point you to what little there is. But if you’re going to be in this space, you HAVE to lead on
    ethical issues.

    I think asking questions is a big part of how we approach this, especially for folks who aren’t running the show where they are. Ask questions, when you’re feeling some
    Big Data surveillance coming on. “Are we actually entitled to collect or use this data? When should we delete it? How are we going to tell students this is happening?
    Isn’t this surveillance creep? What harm could come to our students from this? How do we not be Facebook?” We ask, and we keep asking.


  57. This is a hot-off-the-presses piece by a Colorado librarian named Shea Swauger. Shea goes for an abolition metaphor here, quite consciously and intentionally, and it’s
    powerful. I recommend the piece unreservedly.

    But I’m choosing to suggest a slightly different metaphor today.


  58. Also hot off the presses, Harvard’s massive endowment fund is finally divesting from environment-destroying fossil fuels, after years of pressure from students and
    others.


  59. And we actually got close to this here in Wisconsin, it turns out! The University of Wisconsin’s own endowment fund has been pressured to divest too. This particular
    attempt didn’t succeed — but only because of a weird loophole, and it sure sounds like the adjudicators here wish that loophole didn’t exist.

    Sometimes that’s how progress happens. Slowly, painfully, and with setbacks.

    But I want to grab onto this idea of “divestment.” Partly, admittedly, because it’s familiar to us in higher ed, and I think that helps.


  60. Photo: Guido van Nispen, “L1006791-Edit”


    https://www.flickr.com/photos/vannispen/34489924632/ CC-BY, cropped
    But mostly because I think divestment is a powerful idea in this context. What would divesting from Facebook look like? What if we bought no advertising from it,
    minimized or even removed our organizational presence there? And why stop at Facebook? Let’s divest from Google Analytics! From Google altogether! (She says, as
    this broadcasts on Google-owned YouTube.) What if we divested from the surveillance that’s in our learning-management systems and our online journals and our e-
    textbook platforms?

    The parallels between higher education divesting itself of environmentally corrosive fossil fuels and divesting itself of socially corrosive surveillance are… actually kind of
    striking, I think?

    For one thing, we can’t divest without admitting where we INvested in the first place, and that it might have seemed like a good idea at the time, but turned out not to be.
    Admitting mistakes is never easy and never fun, but as I suggested earlier, I think that style of self-reflection is healthy and wise.

    For another — look, we in higher ed like to think of ourselves as leaders and shapers of society. We serious about that? Then it’s time to divest, I think.


  61. DIVEST FROM
    So that’s the provocation I’ll leave you with today. The good people with the good shoes, in higher ed and outside of it, divest from Facebook.


  62. Thank you!


    Copyright 2021 by Dorothea Salo.


    This presentation is available


    under a Creative Commons Attribution
    4.0 International license.


    Please respect licenses on included photos.
    All clip art from openclipart.org.
    Because it’s not too late! I wouldn’t be standing here if I thought it was too late. We CAN avoid becoming the dumpster fire that is Facebook. It will take wisdom and it will
    take leadership. I hope all y’all will help provide those.

    Thank you.
