
Ethics smells: when it looks ethical but something smells off...


For the one-credit topics course "[Big] Data Ethics," taught at the University of Wisconsin-Madison Information School, summer 2019. Has been edited for clarity and to add new ethics smells since.

Dorothea Salo

June 12, 2019

Transcript

  1. “ETHICS SMELLS”
    when it looks ethical, but something smells off…


    Dorothea Salo


  2. PROGRAMMERS AND “CODE SMELLS”
    •Sometimes a given chunk of programming code works, but
    appears highly likely to cause a problem down the road.


    •Missing edge cases


    •Lots of cut-and-pasted code (unmaintainable after a certain point)


    •Cuts corners on security


    •Etc etc etc—ask a dozen programmers, get fifty problems cited


    •These hints at poor/unstable/unmaintainable/hackable code
    are called “code smells.”


    •I ruthlessly appropriated this idea into “ethics smells.”
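The programming idea translates directly. A minimal, hypothetical Python sketch of two of the smells named above — a missing edge case and cut-and-pasted code — where both functions work on typical input but predict trouble down the road:

```python
def average_scores(scores):
    # Smell 1: missing edge case -- an empty list raises
    # ZeroDivisionError, but nothing here guards against it.
    return sum(scores) / len(scores)

def average_passing_scores(scores):
    # Smell 2: cut-and-pasted logic -- the averaging code is
    # duplicated instead of reusing average_scores(), so a fix
    # in one place can silently miss the other.
    passing = [s for s in scores if s >= 60]
    return sum(passing) / len(passing)
```

Nothing is "broken" yet — the smell is the hint that something will break, which is exactly the analogy the deck borrows for ethics.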


  3. WHAT’S AN ETHICS SMELL?
    •A trope, rhetorical flourish, or line of reasoning that may look
    reasonable at first glance, but hints at ethical flaws.


    •Not everyone guilty of an ethics smell is a Bad Person doing Bad
    Things. It may not be that simple.


    •Sometimes ethics smells become part of prevailing discourse! Such that
    omitting or questioning them feels weird!


    •Sometimes people just haven’t done the work yet, or are seduced by
    whatever the latest shiny thing is.


    •Sometimes the ethical issues are legitimately hard to understand!


    •But yes, sometimes people do use these tropes etc. to forestall
    or deflect legitimate ethical critique. That’s not okay.


  4. EMPTY PLATITUDES
    •E.g. “Your privacy and security are very important to us.”


    •Who even believes this now? We’re all like “prove it,” for good reason.


    •How can you tell a pleasant-seeming ethics platitude is empty?


    •Is any information given about actual actions in support of the ethics? If
    not, be suspicious.


    •If what they’re saying about the ethics issue confuses you… that may be
    intentional. Be suspicious… especially where they’re using way more
    words, or way more jargon, than seems warranted.


    •When in doubt, check into the person or organization’s track record. Is
    there reason to believe they will act (un)ethically? Are they self-serving?


  5. ONE DUMPSTER FIRE OF
    BEWILDERING BAFFLEGAB LATER…
    Higher-ed’s Surveillance Central!


  6. “WE’RE GOOD PEOPLE! IT’LL BE FINE!
    IT’S FINE BECAUSE IT’S US!”
    •Good is not a thing you can be. It’s only a thing you can DO.


    •When you stop doing good—especially when you start doing bad
    —you fall right off the high road.


    •This one is psychologically understandable; we all desperately
    want to believe we’re good people.


    •Ironically, insisting that we are good people no matter what we
    DO raises the chances we stop governing ourselves aright.


    •LIBRARIANS, WE ARE PRONE TO THIS ONE.


    •“To serve our patrons!” can become the door to evil. I’ve seen it.


  7. EXACT WORDS
    •(see TVTropes.org on this, also “Insistent Terminology”)


    •Often used to justify ethically dubious behavior because fifty
    pages of legalese supposedly spelled it out, and/or the victims
    “consented” to it.


    •“if you read the Exact Words of our novel-length privacy policy…”


    •Sometimes a distraction tactic; see e.g. Facebook: “the
    Cambridge Analytica thing wasn’t actually a breach!”


    •Technically true—Cambridge Analytica was allowed access to the data
    they abused by Facebook itself; nobody’s security was broken—but so
    what?


    •How is “Facebook allowed CA to use Facebook data to try to throw
    elections!” supposed to be an ethical improvement here?


  8. “BUT MY THING IS NEUTRAL!”
    •Librarians: we are SO GUILTY of this one. So very, very guilty.


    •It’s also rife in computer science. I don’t entirely know why (and I doubt
    it’s a legacy from LIS), but I know from experience and reading that it’s so.


    •Often the subtext here is “how dare you judge me?” or “I shouldn’t
    have to think about the ethics of what I do!”


    •Often an attempt to evade ethical questions around power and
    oppression.


    •If it’s “neutral,” it can’t be racist or sexist or ableist or ageist or
    homophobic or transphobic OR GENOCIDAL (Facebook!!!), can it?!


    •Decades of STS, LIS, and ethics work give the lie to “neutrality.” I
    don’t have time to recount it all (ask me if you want readings).


  9. THE “NEUTRAL PLATFORM”
    Story from The Atlantic.


  10. OWN THE GOOD, IGNORE THE BAD
    •aka “because it helps [some] people, it must be in the clear ethically!”


    •This can be nakedly self-serving, of course.


    •And it fails consequentialist ethics forever!


    •Can also be power and privilege: “this thing helps some people
    (often ‘people just like me’), so all the people it hurts don’t matter!”


    •Sometimes it’s naïve optimism at work.


    •You may be able to tell the difference based on how long the responsible
    entity has been up to whatever the thing is. Are they still under the spell of
    the shiny? Or should they really know better by now?


    •I say this because you’ll want to tailor your counters.


    •Take the self-serving and privilege-leveraging down. No mercy.


    •Ask the optimists questions about how they’re handling their thing’s ethical
    issues. They don’t have to be gentle questions, necessarily!


  11. THIS GUY SHOULD KNOW BETTER.
    Story: Misha Angrist, “Do You Belong To You?” 2 January 2018.


    http://genomemag.com/do-you-belong-to-you/


    Fair use asserted.


  12. BUT WHAT DO WE DO WITH THIS?


  13. VARIANTS ON “IGNORE THE BAD”
    •FOMOngering


    •“Everybody’s doing this!” (I have yet to see this be true when said
    aloud. It is invariably bullying or rationalization, not fact.)


    •“If you’re not doing this, you’re WrongBadBehind and you’ll die!
    Unethical-Thing is your only hope!”


    •Risks, what risks? Ethics, what ethics?


    •Definitely look for a self-serving (or at least self-justifying) motive here.


    •“We have an obligation to do this! How can we not?”


    •Often accompanied by a heartstring-tugging case study


    •Conspicuous by its absence: any discussion of risks or harms, except
    possibly for an Empty Platitude or two


    •(This is so common in learning analytics. So very, very common.)


  14. [image-only slide]

  15. VARIANT: SCAREMONGERING
    •“If you don’t do this dubiously-ethical thing, All The Bad Stuff
    will happen and it’ll be YOUR FAULT!”


    •So nakedly self-serving that you’d think people would see
    through it instantly… but if the Bad Stuff is bad enough and
    people are desperate enough to avoid it, we land here.


    •Sometimes the desire to Do Something, or Be Seen To Be Doing
    Something, overcomes common sense.


    •(Comes up frequently to justify surveillance because of crime-
    avoidance/safety.)


  16. [image-only slide]

  17. “WE CAN’T TELL THEM WHAT WE DO!
    THEY JUST WOULDN’T UNDERSTAND!”
    •This is invariably code for “we’re doing a Bad Thing and we don’t
    want people to know about it because they would absolutely
    object to it.”


    •I don’t love notice-and-consent. I will never love notice-and-
    consent. (See “Exact Words” slide for why.)


    •But keeping dodgy practices secret is even worse.


  18. [image-only slide]

  19. “THOSE PESKY ETHICS, ALWAYS
    GETTING IN THE WAY OF PROGRESS”
    •I don’t think I’ve ever seen this be so much as well-intended,
    never mind ethics-minded. It exists to delegitimize ethical
    concerns and those who hold them.


    •Seeing it? Question “progress,” loudly and often.


    •“Progress” is not always the word that will be used, of course;
    “innovation” is another common culprit.


    •Medicine: Primum non nocere = first, do no harm.


    •Not the same as a conflicting-ethics dilemma—that’s normal, to
    be expected, and absolutely a legitimate thing to think through.


  20. WHEN PEOPLE TELL YOU WHO THEY
    ARE, BELIEVE THEM!
    Oakleaf, Megan. 2018. “Library integration in institutional learning analytics.”
    https://library.educause.edu/-/media/files/library/2018/11/liila.pdf


  21. VARIANT: “ETHICS ARE OUT OF SCOPE.”
    •Technology standardistas and startups are fond of this one.


    •Do not even get me started about the NISO-sponsored library-
    technology standard RA21/SeamlessAccess. We will be here for weeks.


    •I got into such an argument on Twitter once about whether OAIS
    should require digital archives to pay attention to risks to individuals or
    groups represented in a dataset, equivalently to “designated
    community” of data end-users…


    •The fight ended with an OAIS booster declaring repeatedly that this
    question was out of scope for OAIS. If so, then OAIS needs to fix its
    scope, in my not-entirely-humble but trying-to-be-ethical opinion.


    •The IETF finally, finally, FINALLY repudiated this in 2019:
    https://datatracker.ietf.org/doc/draft-iab-for-the-users/


    •“The IAB encourages the IETF to explicitly consider user impacts and
    points of view in any IETF work.” Whoa. That’s a change for the better.


  22. “THIS PROBLEMATIC THING
    IS A FACT OF LIFE NOW.”
    •Zuckerberg is full of these. “Only connect.”


    •Ethics rule of thumb: If you start to sound like The Zuck, stop.


    •So many Big Data news stories and explainers start with this!


    •So much that I think it’s now one of those part-of-the-discourse things…


    •Sometimes, as with The Zuck, this is a naked attempt to set the
    terms of discourse to avoid ethical responsibility.


    •In my experience, though, this one is especially likely to occur in
    the writing and work of people who are basically decent.


    •If that’s you, my advice is, do not cede ground without a fight.


    •Is the problematic thing really a fact of life? Is it truly unsubstitutable,
    unstoppable? Or is it just convenient for some people if we all think so?


  23. “NOBODY TOLD ME I SHOULDN’T!”
    •The flip side of “I was just following orders,” perhaps? Since
    Nuremberg at least, we’ve known that one’s out of bounds!


    •see also TVTropes “Ain’t No Rule” (often “Ain’t No Law/Regulation”)


    •So ethically lazy I can’t even. OWN YOUR ACTIONS. But I’ve seen it.


    •Assumes law/ethics systems are perfect and omniscient, such that
    if someone were really doing something bad, they’d be stopped.


    •This is nonsense, of course. Law lags bad behavior even more than ethics!


    •Not the same as “the ethics system here has gaps/loopholes;
    here’s how I navigated them, though I may have gotten it wrong.”


    •Where you see this, it is typically a thoughtful analysis that understands
    that where ethical guidance is not clear or there is no applicable guidance,
    we have a duty to think the ethics through on our own.


  24. ABOUT THAT “GAYFACE” STUDY…
    From Metcalf, Jacob. 2017. “‘The study has been approved by the IRB’: Gayface AI, research
    hype and the pervasive data ethics gap.”


    https://medium.com/pervade-team/the-study-has-been-approved-by-the-irb-gayface-ai-research-hype-and-the-pervasive-data-ethics-ed76171b882c



  25. “WHO COULD HAVE KNOWN
    THIS WOULD BE BAD?”
    •Often a sign of someone who didn’t do their historical or ethical
    homework. Or said “talk to the hand!” to anyone who DID know.


    •Also common where the responsible entity is not inclusive.


    •See e.g. Google, which screwed this up repeatedly with search, image
    auto-tagging, Buzz, and Google+ (look up the “nymwars”).


    •Sometimes a sign of a new, naïvely optimistic person or field.


    •Information security labors under a lot of technology standards and
    infrastructures that were poorly designed because nobody thought
    anybody would ever hack or misuse them. (See e.g.: email, BGP.)


    •I can grudgingly forgive this… for a little while. (For Big Data… it’s
    way, way too late for any more of this.)


  26. VARIANT: “IT’S TOO EARLY TO TELL
    IF THIS THING I DID WILL BE BAD”
    •This variant often comes from academic researchers,
    particularly of the “tech is neutral!” variety.


    •Often code: “I don’t want to think about ethics! Don’t make me!”


    •If you see a lot of privacy experts and/or historians head-
    desking over something… look at their analogies carefully.


    •Sure, don’t accept anything you hear without checking; that’s fine.


    •But my experience, for what it’s worth, is that these folks aren’t
    alarmists, just people who have seen this before and understand the
    patterns behind it.


  27. Story from The Atlantic.


  28. ABOUT THEM, WITHOUT THEM
    •(riff on “Nihil de nobis sine nobis,” which has been around a long time)


    •“We’re going to use this questionably-ethical thing to fix all Their
    problems for Them!”


    •They, whoever They are, are either conspicuous by Their total absence, or
    “represented” by a single token person/case study.


    •They are frequently in a less-powerful, less-privileged, less-voice-y position
    than the speaker. They may in fact have no option to refuse the intervention,
    which is questionably ethical all by itself.


    •So much wrong with this. So much.


    •Othering people is deeply uncool. So is disempowering people while
    ignoring their voices.


    •“Deficit model”—They have problems (not strengths!), and We Know Better
    Than They Do how to fix Them.


    •Often operating off stereotypes and superficialities—frequently blatantly
    incorrect ones—about Them.


  29. VARIANT: PROBLEMATIC THING
    FOR THEE, BUT NOT FOR ME
    •Ethically-problematic thing is forced onto people with less social
    power, somehow miraculously bypassing people with more.


    •Exceptionally common with surveillance and surveillance-based
    interventions: predictive analytics in government, workplace
    surveillance, learning analytics, public video surveillance, the
    entire history of Facebook


    •Beware surveillance creep! Just because you’ve not yet been
    caught in the dragnet doesn’t mean you never will be.


  30. I’M SURE THERE ARE MORE!
    •These are just the ethics smells that either have been itching me
    forever or that I happened to notice recently.


    •Contribute your own on the forums, or in your weekly reflections.


  31. THANKS!
    This presentation copyright 2019 by Dorothea Salo.


    It is available under a Creative Commons Attribution 4.0
    International license.
