
Community engagement [in digital collections]


Lecture for LIS 668 "Digital Curation and Collections."

Dorothea Salo

October 16, 2019

Transcript

  1. “Community”? • Yeah. Talk about incredibly overloaded words. Sorry. •

    Breaking it down a bit (these categories may overlap): • communities who created materials in your digital collections • communities represented in your digital collections • communities who have an ownership claim to materials in your digital collections. N.b. this doesn’t necessarily mean a legal claim! The law can be ridiculous about this and often is. Moral claims count too! • communities you want to help you do the work as you build your and/or their digital collections • communities you want resources and support from • communities you hope will notice, use, and benefit from your digital collections (≈ OAIS “designated community”) • communities who may cause trouble over (or with) your digital collections. I’d like to ignore trolls and hackers too, but that’s not practical.
  2. Agenda • OAIS and the “designated community” • I hate

    this. Hated it throughout my career as an IR manager. Hate it SO MUCH. But you need to understand it so I need to talk about it. • Before-project engagement • Communities represented in the materials • Courting funders, partners, and helpers • During-project engagement • Subject matter experts • Volunteers, internships, crowdsourcing • Post-project engagement • Outreach • Education • Social media
  3. Thanks! Copyright 2019 by Dorothea Salo. This slidedeck available under

    a Creative Commons Attribution 4.0 International license.
  4. Quick refresher on OAIS • Originally built by NASA as

    a guide to what they needed a digital-preservation repository to do, and how they needed humans to interact with it. So far, so good. • Absolutely inexplicably became THE guide to these things. For everybody, at least in theory. • I mean it. I cannot explain this. Certainly not based on the virtues of OAIS. • Has an entire train of apologists. • “Well, if you stretch the meanings of words to infinity, OAIS still works for your use case!” (Ummmmmmm…) • “Well, it’s only a model!” (Yes. An incomplete, incorrect, and often unworkable model that despite its problems has near-infinite authority… and whose boosters haven’t fixed its problems.) • “Well, you don’t have to use it!” (Tell vendors and repository auditors that.)
  5. “Designated community” • The basic idea is: figure out who

    you expect to USE the digital materials you’re preserving — that’s your DC — and how they use them, then make sure they can use them that way (ideally without bugging you about it). • This idea has consequences for: • File formats, format migration/normalization, software dependencies. Examples: CAD blueprint files, research-data files in (e.g.) Excel. Don’t save them to PDF and call it a day! You just destroyed much of their usefulness to their DCs! But… what happens when some CAD software becomes obsolete? • Metadata. Will the DC be able to find the collection? Understand what it is and what’s in it? • Collection and item documentation. Is it appropriate to the DC? (I’ll be less vague about this elsewhere in the course, I promise.) Sufficient for the DC to use it?
  6. “DC” is often problematic here in the World Outside NASA.

    • OAIS tacitly assumes one or only a few DCs. • If you’re running an institutional repository or a multi-disciplinary, multi-institution research-data repository, this is a ludicrously false assumption — and it’s not reasonable to expect you to canvass every DC! Different DCs’ needs may even conflict sometimes! • I mean, it made sense for NASA, sure. We ain’t all NASA. • Assumes that the only communities a well-run repository needs to consider and document are its DCs. I find this assumption absolutely appalling, even dangerous. • A repository containing data about living persons that isn’t considering risks to them is utterly irresponsible. NASA doesn’t have to care because you can’t dox or arrest an asteroid, but WE AIN’T ALL NASA. • Maybe the ethics of representation and credit aren’t in-scope for OAIS… but then again, maybe they should be? (I had this fight on Twitter…)
  7. Thanks! Copyright 2019 by Dorothea Salo. This slidedeck available under

    a Creative Commons Attribution 4.0 International license.
  8. Why bother? • When do you need to know a

    project isn’t feasible? ASAP. • Wow, have I seen some grant apps containing… fanciful… assumptions. • When would you rather learn that you shouldn’t do a given project at all? Before you start — or after you’ve unveiled it? • The latter horror scenario has actually happened. The collection of On Our Backs that Tara Robertson talked about? Withdrawn. After lots of bad press. • You do want money and help, right? You can’t assume they’ll Just Happen. • Not-uncommon issue with projects dependent on crowdsourcing. If the crowd doesn’t show up… or can’t figure out how to help you… or finds the process of helping you boring, dispiriting, or too difficult… • This is not just an external communities problem, by the way! You may have to convince your colleagues and management that a project should be done.
  9. Communities who own • COPYRIGHT. Copyright copyright copyright. • We’ll

    talk more about this in the appraisal/selection module. • Repatriation of stolen/looted works. • Often THE LAW around First Nations works. If you’re in or around this area, read your NAGPRA! • “Digital repatriation” (handing back a copy of the digitized files) can sometimes be negotiated, and can be a win for all parties. • “I took that photo! That’s my grandmother! How did you get this? She wouldn’t want that digitized!” • (This actually happened.) • Sometimes it makes sense to hold firm. Sometimes… it doesn’t.
  10. Researchers as data owners • Whoof. They’re difficult. • They

    think they “own” data they (legally speaking) don’t. • They think you don’t have anything to teach them because you’re not an expert in their discipline. They’re wrong. • They don’t understand the difference between you and IT, and they may disrespect you because you’re not IT. • They think they understand digital preservation. They’re usually wrong. • Honestly? You can only do what they let you do. If they’re not really willing or able to hear or heed you, WALK AWAY. • This is actually a good scenario to ask about in an interview for a research-data job. If your managers will get upset with you for walking away… don’t walk, RUN AWAY from that job. • Which leads me to…
  11. Data Management Plans • Required by many research funders, especially

    post-2011 when the NSF started requiring them. A few funders even specify some of what has to be in DMPs. • This came as a rude shock to many researchers. Others treat it as one more hoop to jump through, don’t take it seriously, and leave it until the last minute. • Guess how well they’ll comply with a last-minute DMP. Just guess. • Your job, given a Last-Minute DMP Special: • Quickly suss out requirements relevant to the grant and its funder. (Ignore any nice-to-haves. This is about what the researcher MUST say or do.) • Supply as much pre-chewed text to the researcher as possible. • Boil down the rest into specific questions the researcher must answer.
  12. DMP machines • They exist. It makes sense to point

    researchers at them as often as possible. • I hate to say it, but this is partly an Appeal to Authority. A researcher who doesn’t respect you may respect an official-looking website. • Start with DMPtool: https://dmptool.org/ • Go to https://dmptool.org/public_templates to see if DMPtool has a template specific to your researcher’s grant or funder. • If not, the NSF-GEN template is often a good place to start. • Some universities have locally-branded DMPtools that take into account local services that fulfill funder requirements.
  13. Communities represented • If the community still exists, it’s on

    you to connect with them. I get that it can be awkward. Tough. It’s still on you. • And more optimistically — it’s not uncommon that the community has never been approached respectfully before, and will be thrilled! • Understand whether the community has reason to like or trust you. UNDERSTAND THAT IT MAY WELL NOT. Approach accordingly. • GLAMs have a long history of oppressing peoples and communities. That history doesn’t just vanish because you want to do a project. • An error I personally saw: seeking a grant for a project before engaging the community represented in the materials. A grant-review committee I was on killed two grant apps for this reason alone — and I agreed with both decisions.
  14. Tips for approaching • Give the community veto power over

    the project. This is one situation in which that is wholly appropriate! • Explain who you are, what you do, and why you are invested in these materials. • Lots of communities have never met a preservation librarian or archivist before. If you don’t explain, they may jump to conclusions. • Explain possible risks (in both directions) clearly. • Risks to the community? Risks to the materials? • Ask whether and how the community would like to be involved and informed if the project goes forward. • You may have to seed this discussion by offering possibilities. • Ask how you and the project can help the community. • LISTEN. LISTEN. LISTEN.
  15. “Community archiving” • … means the community leads. Not you.

    • “Post-custodial archiving” means recognizing that it’s the community’s stuff. Not yours. • Helping a community care for its stuff does not make the stuff yours! • Much of the pushback against these ideas is coming from people for whom “The Expert” or “The Professional” is a lot of their identity. • “I know best, which means I get to tell everyone else what to do!” For some reason, this attitude is not popular with communities… • Also “The Collector.” • “Whoever dies with the most stuff wins!” 20th-century archives and academic/research library special collections were pretty much organized around this idea! • If that’s you… I get it? But this isn’t considered appropriate now, and I need you to reframe it in your own mind.
  16. When it’s research… • … additional constraints apply to data

    representing individuals. • Many of these are codified in the “Common Rule,” which governs Institutional Review Boards (IRBs). • I know you went over this in LIS 603, so I won’t recap. • Depending on the research and the research participants, HIPAA, FERPA, and/or GDPR may be in play, as well as the Common Rule. • For us, the important thing to consider is additional risk to participants from digitizing/preserving the data. • If it’s sensitive data that’s (re)identifiable, digitization and preservation may be completely inappropriate. Making the data open almost certainly is! • N.b. this is incomplete! Harm may occur to other people and communities besides participants. Unfortunately, the Common Rule doesn’t presently recognize that… but you should.
  17. Grant funders… • … we’ll talk about during Sustainability Week.

    • But it can’t hurt to think of them as a community with its own needs. • Top need: to demonstrate that they’re investing money wisely. Particularly for federal grant agencies, your project needs to be something defensible — including to people who hate that agency and everything it stands for. • (IMLS and NEH have repeatedly been written entirely out of presidential and congressional budgets.) • Grant agencies and specific grant programs have missions. Your project needs to fit with their mission. Pick the right funder and the right program, and don’t make grant reviewers figure out the connection between your project and their mission — your application should make this crystal-clear!
  18. Thanks! Copyright 2019 by Dorothea Salo. This slidedeck available under

    a Creative Commons Attribution 4.0 International license.
  19. Participant communities • “Subject matter experts” • Often creating metadata

    for items you digitize, descriptions for the collection as a whole, ID-ing people and situations in photos, etc. • Volunteers • Sometimes doing the actual digitization • Sometimes doing metadata • Sometimes doing quality-control work • Crowdsourcing participants • Transcriptionists • Metadata-ists
  20. OAIS is useless here! • Your participants may or may

    not be from the material’s “designated communities.” If they’re not, OAIS has nothing to say about them! • If they are, OAIS only considers them as end-users, not as project participants. • Again, asteroids can’t create metadata. WE AIN’T ALL NASA, though. • Unfortunate corollary: your “OAIS-compliant” system may not be designed well for your project participants. • … frankly, you should expect this. It’s been part of plenty of project failures. • Even systems not claiming OAIS compliance fail here. • Wrong response: “well, they just gotta learn to use the system; it’s what we have.” • Kiss your project participants goodbye with this attitude. Ask any IR manager. Or, for that matter, ask Wikipedia.
  21. Awesome. How to fix this? • If you know me,

    you probably already guessed “process.” Good for you! • Your job: figure out a process that works for your participants, test it with them to refine it, document it, and force your system to accept it. • That last part gets technically tricky, I won’t lie. I personally fought DSpace developers FOR YEARS to get usable deposit interfaces… and lost. I also lost a job with DiscoveryGarden because I hated Islandora interfaces… and said so. • With DSpace, I ended up doing a lot of deposits on depositors’ behalf. So it goes. A lot of the software in these spaces is so, so terrible. • Investigate your system’s APIs, batch-work mechanisms, and other technical infrastructure to see if there’s a way to make it work for you… instead of against your participants. I used DSpace’s export-import feature a lot for this, also SQL access to its database (for batch-fixing metadata, usually).
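To make the batch-fixing idea concrete, here is a minimal Python sketch of rewriting one metadata column in an exported CSV, then re-importing the result. The CSV shape (one row per item, one column per metadata field) loosely mirrors what a DSpace metadata export produces, but the column names, the date format, and the `fix_date` rule are all illustrative assumptions — adapt them to whatever your system actually exports:

```python
import csv
import io

def batch_fix(csv_text, field, fixer):
    """Apply fixer() to one metadata column of an exported CSV.

    Assumes a repository-style metadata export: one row per item,
    one column per metadata field. Column names are illustrative.
    """
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    for row in rows:
        if row.get(field):
            row[field] = fixer(row[field])
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=rows[0].keys())
    writer.writeheader()
    writer.writerows(rows)
    return out.getvalue()

def fix_date(value):
    """Normalize a hypothetical M/D/YYYY date to ISO YYYY-MM-DD."""
    month, day, year = value.split("/")
    return f"{year}-{int(month):02d}-{int(day):02d}"

# A tiny made-up export: one item with a sloppy issue date.
exported = "id,dc.date.issued\n42,3/7/2019\n"
fixed = batch_fix(exported, "dc.date.issued", fix_date)
```

The point isn't this particular fix — it's that round-tripping through export, script, and import keeps batch cleanup out of your participants' way entirely.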
  22. Okay, I have a process. Now what? • Document it,

    of course. • If you have a tech-writing background, you can skip this bit of lecture. Most of us don’t, though. (I don’t myself! I had to learn this the hard way.) • One not-bad method of documenting a thing: • Do the thing. • Write down how you did the thing, as clearly and completely as you can. (You may be able to do this while you are doing the thing. I find this helpful because my memory is a steel sieve, but it isn’t necessary.) • Split your writing into individual instructions. • Illustrate instructions and provide examples, where that's helpful. • TEST YOUR INSTRUCTIONS WITH PARTICIPANTS. Fix whatever’s confusing or blocking them. Repeat. • Add troubleshooting instructions as needed.
  23. Docs needn’t be text! • (I know, right? But this

    was a revelation to me. Yes, I am hyperverbal.) • Oscar Grady Library, Saukville WI, has a personal-media-archiving lab. It’s extensive and impressive! • It inspired a lot of RADD, PROUD, and PRAVDA. • Expecting all their public-service staff to learn how to use all the equipment was not reasonable. • So here’s what library director Jen Gerber did: • For each type of media the lab can handle, a librarian recorded a screencast of how to work with it. • Those screencasts were posted to YouTube. • When someone comes in to use the lab, desk staff are trained to ask which kind(s) of media the patron has, and point the patron to the corresponding video(s). It's worked great for them!
  24. Be aware, humans are bad at: • Following instructions •

    Doing repetitive tasks consistently • Thinking their way into someone else’s situation (vitally important in metadata construction), much less into how systems work (ditto) • A lot of the training we give you in LIS 602, LIS 632, and LIS 651 is specifically aimed at correcting these very, very human tendencies! • Your participants won’t have had that training. Document accordingly! • Don't try to make them into mini-info-pros (it won't work), but… • … do try to anticipate mistakes they’ll make, and try to head them off.
  25. Read and edit docs for: • Good things • Examples.

    Many people (self included) skip straight to these, only going back to the instructions if the example isn’t enough. • Screenshots and (if physical equipment is involved) photos. • Good layout and design! Nobody likes double-spaced Times New Roman. • Consistent structure. • Not-so-great things • Info-pro jargon. NEVER SAY THE WORD “METADATA” TO A SUBJECT-MATTER EXPERT (unless you know they already know it). • Forcing the participant to make a lot of decisions. Whenever possible, take decisions out of their hands. (This often occurs with software configuration. Set the defaults properly to begin with! Don't make the participant mess with them!)
  26. Training • The time-honored method: watch me, do it with

    me, do it yourself with me watching, do it alone. • It’s time-honored because it works. • Use training to evaluate your process and docs, too! • Is there a way to make it easier? • Do the docs cover the place where your participant got stuck? • Did your participant do something wrong that surprised you? Go back and make sure the docs cover it! • Rabbit hole to avoid: One Million Interrupty Questions. • (I'm fighting this with RADD right now and it’s deeply frustrating.) • Sometimes you have to broken-record this: Read the docs. Read the docs. Read the docs. Read the docs. Read the docs. Read the docs. Read the docs.
  27. Crowdsourcing • If you have a set of tasks and/or

    materials that carries interest for folks, this can be a win all ‘round. • Many people like to contribute time and effort! • If there’s a defined interest group for the materials (e.g. Civil War aficionados) they can be excited to improve access to relevant materials. • If you’re thinking “woo-hoo, free labor!” STOP. Check your ethics before you wreck your ethics. • What are your participants getting out of this? The answer had better be better than “the warm glow of working for us for free.” • Software for some types of crowdsourcing is out there. • Transcription especially, but there are a few decent metadata tools.
  28. “Hope labor” • (credit for this phrase goes to Dr.

    Miriam Posner) • Free work because “it’s a good experience” or “it’ll give you a leg up on the job market/for graduate school.” • Sometimes the “leg up” is unfortunately true. • Why does the iSchool require a practicum for the MA? Because it's not just a leg up on the job market — we have good reason to believe that plenty of students may be unemployable without it. • With a practicum, we also have some ability to do quality control over the experience; we can't do that with jobs. • Sometimes it’s the purest empty rationalization. Check your ethics, again. • As a grant reviewer, I’ve heavily questioned apps over exploitation of student labor. It’s not okay. • PAY YOUR INTERNS. Their free labor only devalues yours.
  29. Thanks! Copyright 2019 by Dorothea Salo. This slidedeck available under

    a Creative Commons Attribution 4.0 International license.
  30. It’s so unfair. • You did all this work to

    do a project or launch a service, and you can’t just let people find it and be amazed at your awesomeness? • I know. It’s not fair at all. But no, you can’t just sit back and rest on your laurels. You gotta get the word out!
  31. The basics • Portals, catalogs, finding aids, and aggregators •

    The Digital Public Library of America is crumbling, but it’s still the biggest game in town for now. If you can get your collections in there, do it. • If you can get a collection record into a library catalog, do that. If you can link to a digitized collection from the collection's finding aid, do that. • Traffic redirectors: search engines, Wikipedia • Design your collection website for decent search-engine optimization. (Is this a reason I suggested you take Information Architecture as a companion to this course? Why yes, yes it is!) • Good metadata matters! Search engines vacuum up text! • Carefully (Wikipedia bans “spammers” and reverts their edits) add links to your materials to appropriate Wikipedia pages.
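Since search engines vacuum up text, one low-effort way to expose item metadata to them is schema.org structured data embedded in each item page. A hedged Python sketch of generating that markup — the item, its field values, and the URL are all invented for illustration, and nothing here is specific to any one repository platform:

```python
import json

def item_jsonld(title, description, url, date_created):
    """Build schema.org JSON-LD for one digitized item, to be embedded
    in that item's page so search engines can index its metadata."""
    return json.dumps({
        "@context": "https://schema.org",
        "@type": "CreativeWork",
        "name": title,
        "description": description,
        "url": url,
        "dateCreated": date_created,
    }, indent=2)

# Hypothetical item record for illustration only.
markup = item_jsonld(
    "Farm scene, Sauk County",
    "Glass-plate negative, digitized 2019.",
    "https://example.org/collections/item/42",
    "1911",
)
# Embed the result in the item page inside
# <script type="application/ld+json"> … </script>
```

If your platform already has clean descriptive metadata, generating this at page-render time is usually a template change, not a cataloging project.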
  32. Reaching communities • Remember that list of communities from the

    beginning of this module? • Those are all communities who should know about your project. Find ways to tell them about it! • (Well, except for the trolls and hackers. Don't tell them.) • Email, social media, news media, in-person contacts, public demos and talks, whatever works for a given community. • Don't assume “oh, the newspaper / radio station / TV news wouldn't be interested.” Write a press release and send it! The worst thing that can happen is nothing — and you might be pleasantly surprised! • Be open to serendipity and new communities. You never know! • (That's why I assigned you the Terras piece.)
  33. Social media • You need a strategy for this (and

    goals, as Sarah Werner points out). Your organization needs to treat it as an ongoing opportunity and responsibility. One-and-done doesn't work! • Really great example: the Museum of English Rural Life (@TheMERL on Twitter). Find their strategy docs (in my Pinboard) — they’re genuinely brilliant. • Have clear end-user licensing! Make crediting easy! • People hate it when they can’t tell what they're allowed to do with your collections. • They hate it even more when you play dog in the manger. No copyright barratry (claiming a copyright you don’t own or that doesn't exist), okay?
  34. Targeting educators • I’m thinking K-12 here, but much of

    this applies to higher ed also. • It’s a great idea! But you have to make it easy for them. • Teachers are incredibly overworked people under unbelievable constraints. • Package up (some of) your materials into easily-adoptable lesson plans. • Student-assessment materials are a huge bonus! • Align those plans with known requirements, e.g. the Common Core, and write that alignment out for them. • Use big events — National History Day, one-community-one-book events, local-history events — to your advantage.
  35. Something you’ll run into • “We can't digitize that. They’ll

    misuse it!” • If this is coming from a community represented in your collection, hear and heed it. They get a veto. • If it’s coming from your colleagues… ugh. And it may! • Try “How is that different from what they can do with our physical collections?” Lots of times it isn’t! • Try “Many researchers will use this collection respectfully and extend our knowledge with it. Doesn't that make this worth doing?” • Try reaching an agreement on how you will contextualize the materials, e.g. in a collection description. • Try NOT to get railroaded into copyright barratry or unnecessary embargoes or end-user restrictions. Fight for your DCs!
  36. And, speaking of misuse… • Trolls and hackers. You can’t

    avoid them, so you have to plan against them. • We’ll talk about some technical pieces of this — backups and ransomware — in another module. • Plan to moderate any public-input mechanisms (comments, trackbacks). Don’t end up like Tay the chatbot! • Crowdsourcing? Build in quality checks. • No transcription should go into your collection without at least one check step! Double-keying with a comparison step is wise! • Political ax-grinders? Ugh. Have your context and your story ready. Better to have it and not need it than… • (Essentially this is a collection challenge. Treat it accordingly; there are best practices for this already!)
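The double-keying check can be sketched in a few lines of Python with the standard library's `difflib`: compare two independent transcriptions of the same page and flag every span where they disagree, so a human reviewer resolves conflicts instead of trusting either keying blindly. The sample strings are illustrative:

```python
import difflib

def doublekey_diffs(key_a, key_b):
    """Return the spans where two independent transcriptions of the
    same page disagree, for a human reviewer to resolve."""
    matcher = difflib.SequenceMatcher(None, key_a, key_b)
    disagreements = []
    for op, a1, a2, b1, b2 in matcher.get_opcodes():
        if op != "equal":  # 'replace', 'delete', or 'insert'
            disagreements.append((key_a[a1:a2], key_b[b1:b2]))
    return disagreements

# Two volunteers keyed the same line; the conflict goes to review.
conflicts = doublekey_diffs("the quick brown fox",
                            "the quiet brown fox")
```

Real crowdsourcing platforms do something fancier (word-level alignment, majority voting across three or more keyings), but the principle is the same: nothing enters the collection on a single unreviewed transcription.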
  37. Assessment: telling a story • At some point post-project, you

    will have to justify the time and money expended. Expect that. Plan to write the Story of Your Project. • Funders and employers will want this done! Know your communities! • This is a big reason you set outreach goals from the start. YOU set the terms on which the project will be assessed. • If you don't set those terms, any political enemies you or your project have will seize on whatever measures make you look bad. Been there, done that! • Sarah Werner suggests goals to consider; I won’t repeat. • Once you settle on a goal, though, you need to figure out how you’ll measure it. Again, it's about setting the terms!
  38. Thanks! Copyright 2019 by Dorothea Salo. This slidedeck available under

    a Creative Commons Attribution 4.0 International license.