Why training? ✦ The motivation is usually not improving security. Maybe it should be, but it isn’t.
✦ The motivation for workplace infosec training is usually COMPLIANCE, whether legal or standards-based.
✦ Legal: if some law constrains an organization to meet minimum security standards, it will usually stipulate some kind of employee training. HIPAA, for example, requires training all employees as part of the hire/onboarding process. GDPR also has training requirements.
✦ Standards: PCI, for example, requires documentation of security practices and procedures, including in employee manuals.
✦ It may also be cleanup after an incident.
✦ With all the hasty, ill-considered, panicky flailing that implies.
✦ And the one-person-messed-up-but-we’re-all-stuck-doing-this problem.
This has implications. ✦ Viewed as an unimportant check-the-box exercise
✦ Including by org-internal infosec folks! They could treat this as an opportunity, but more often they roll their eyes at it.
✦ If training is developed internally, it’s often done by people who… don’t teach well.
✦ So it’s too techie, or condescending, or communicated ineptly, or…
✦ Teaching, like most work, is a set of learned skills not present at birth.
✦ If training is outsourced (as it often is), there’s no connection to the local environment.
✦ Production values are likely higher, but it’ll be easy to scoff at because the examples will feel farfetched and the systems discussed won’t be familiar (or named according to local practice).
✦ While I understand the need to raise the floor, repeated 101-level training does not help people learn more advanced (and useful!) concepts, tools, or behaviors.
✦ It sure does teach them to despise infosec, though. How remarkably counterproductive!
✦ Moral: let people “place out” of basics
✦ e.g. through quizzing them up-front and exempting those who pass
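✦ A minimal sketch of what “placing out” could look like, assuming a simple score threshold; the threshold and course names are illustrative, not from any real system:

```python
# Hypothetical "place out of the basics" logic: quiz everyone up front,
# exempt those who pass, route only the rest through Infosec 101.
# PASS_THRESHOLD and the course names are assumptions for illustration.

PASS_THRESHOLD = 0.8  # assumed passing score; tune to your org

def assign_training(quiz_scores: dict[str, float]) -> dict[str, str]:
    """Map each employee to the training they actually need."""
    assignments = {}
    for employee, score in quiz_scores.items():
        if score >= PASS_THRESHOLD:
            assignments[employee] = "advanced-topics"  # no 101 rerun
        else:
            assignments[employee] = "infosec-101"
    return assignments

print(assign_training({"alice": 0.95, "bob": 0.60}))
# {'alice': 'advanced-topics', 'bob': 'infosec-101'}
```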
This also means knowing people’s practices. ✦ As we’ve discussed, people’s approach to many infosec matters falls into known patterns.
✦ Infosec training rarely takes that into account.
✦ “Choose a strong password!” instead of “Here are common password practices that are easily breached, so don’t:” or even “here’s why the system’s password tester rejects certain kinds of passwords:” (see the sketch after this list)
✦ “Don’t click on links in emails!” when that’s clearly impractical advice.
✦ (I’ve seen the above in UW trainings. UW constantly sends out emails with clickable links. Sigh.)
✦ Infosec training rarely if ever starts with ethnographic-style inquiry into the org.
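✦ For the password example above, a rough sketch of a tester that explains its rejections instead of just barking “choose a strong password!”; the specific rules and the tiny breach list are common illustrative examples, not any particular org’s actual policy:

```python
# Illustrative password checker that explains *why* it rejects.
# The rules and the breach list below are assumptions for illustration.

COMMON_PASSWORDS = {"password", "123456", "qwerty", "letmein"}

def rejection_reasons(pw: str) -> list[str]:
    """Return reasons a password would be rejected (empty list = accepted)."""
    reasons = []
    if len(pw) < 12:
        reasons.append("under 12 characters: cheap to brute-force")
    if pw.lower() in COMMON_PASSWORDS:
        reasons.append("on attackers' most-tried lists: breached instantly")
    if pw.isdigit() or pw.isalpha():
        reasons.append("single character class: shrinks the search space")
    return reasons

for reason in rejection_reasons("letmein"):
    print("rejected:", reason)
```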
One size fits no one ✦ Typically, everybody in the org gets the same old Infosec 101 training.
✦ This completely ignores different behaviors, threats, and risks associated with different roles.
✦ Example: As an instructor here, I don’t need to know about infosec in hiring. Not my problem! I don’t do hiring paperwork!
✦ Do I need to know how to keep student emails to me secure? Heck yeah I do; if I don’t do that, I risk harm to students and violating federal law (FERPA). Our HR person also needs to know this, because we hire student employees.
✦ And because people (wrongly) think that the same old Infosec 101 is everything they ever need to know… they’re vulnerable to role-based attacks.
Focusing too much on tech ✦ What use is the strongest password there is, if people aren’t prepared to resist attempts by social engineers to get them to reveal it?
✦ PHISHING. Phishing phishing phishing.
✦ Did you know that business-email compromise (spearphishing aimed at conning people into various kinds of false payments) is collectively the most expensive infosec fail in the US? Now you do.
✦ Yet most Infosec 101 is all about tech stuff.
✦ This is even worse because social engineering is… actually hard to make boring?!
✦ And it sets up a clear us-against-the-world feeling, which is socially useful for infosec folks and the org in general.
✦ (Though be careful to discuss insider threat also!)
In fact… ✦ … in my experience, it’s a lot easier for people to understand attacks if they first understand attackERS, and their techniques and motives.
✦ Would this have a lot to do with the topic sequencing in this course? Yep, you betcha.
✦ Use the fluency heuristic (people remember and repeat what they’re familiar with) to help.
✦ Get people familiar with an “attacker” persona.
✦ Would this be why I teach you Alice/Bob/Eve and use the phrase “garbage human” a lot? Yep, you betcha, at least in part.
✦ (Alice/Bob/Eve is common infosec jargon; that’s another reason I use it.)
✦ And use real-world case studies, ideally from peer orgs in the same or similar industry.
2021 research on infosec training: ✦ “[E]mployee perceptions of [training] programs relate to their previously held beliefs about:
✦ “cybersecurity threats,
✦ “the content and delivery of the training program,
✦ “the behaviour of others around them, and
✦ “features of their organisation.” (This amounted to the usability of security measures and perceived necessity to work around them… which should sound familiar to you by now!)
✦ From:
✦ Reeves, Calic, and Delfabbro. “‘Get a red-hot poker and open up my eyes, it's so boring’: Employee perceptions of cybersecurity training.” Computers & Security 106 (2021). https://doi.org/10.1016/j.cose.2021.102281
✦ Best article title ever, or best article title EVER?!
What do people believe about infosec threats? ✦ We’ve seen a lot of it before. But for the record, here are themes the article found:
✦ “I already know all this, or at least enough to make my own decisions!” (Another pitfall of neverending Infosec 101… people have no reason to realize everything they don’t know, because they’re never shown anything beyond the 101 level!)
✦ “I don’t need to understand this, even if I could! The system needs to handle its own security! Usably!” (Attributed to “younger users.”)
✦ Password issues. (Including one org where the training recommended a password manager… which the org refused to let employees install or use.)
Content and delivery ✦ Few surprises here: it ain’t great. (For the record: I very don’t love our trainings here.)
✦ People’s mood at training time also matters, unsurprisingly.
✦ Misery loves company: group training preferred to individual.
✦ I have sympathy! Of course I do. High production values take a lot more time and effort than I have.
✦ Add in internationalization and accessibility requirements, and the workload multiplies by… a lot.
✦ I don’t have sympathy for:
✦ Measures that try to force attention — e.g. online trainings that won’t advance unless they’re the topmost window, video that only plays at 1x speed. Get over yourselves, trainers.
✦ Irrelevancies. Again, know the context and work within it!
✦ Unexplained or unnecessary jargon. Condescension. Scare tactics.
The behavior of others ✦ We’re social animals, we humans. We behave as the others around us behave.
✦ In almost all organizations, “the others around us” are not infosec folks!
✦ And infosec folks tend to be pretty siloed. Those of you with jobs: do you know who’s securing your org’s systems? By name? I’d be surprised if you did. Who’s the current UW-Madison CISO, for that matter? I don’t think I’ve mentioned…
✦ So we enable each other’s poor infosec hygiene.
✦ Including spreading misunderstandings and broken mental models.
✦ It’s worse if the poor hygienist is our boss. They can demand that we do the wrong thing!
✦ Where training conflicts with human social behaviors, training will lose. No question about it.
Okay, two caveats: ✦ People will use convenient, available, usually informal comms channels to ask infosec questions.
✦ They come up not-infrequently on the UW-Madison subreddit.
✦ It can make sense to (discreetly and non-creepily) keep an eye on those.
✦ Setting up alerts on words (e.g. in a work Slack) may make sense. (A minimal sketch follows this list.)
✦ If there’s an infosec person (or a connector, as before) with very good social judgment and skill at explaining, answering questions or weighing in on situations may be a way to build useful social trust.
✦ But the good social judgment is vital!!!! Just horning in on every vaguely-relevant conversation (or worse, posting canned answers) won’t help — in fact, it’ll hurt.
✦ (I do answer infosec questions on the sub when I see them. My answers get upvoted, so my social acumen seems… mostly okay?)
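✦ A minimal, platform-agnostic sketch of the keyword-alert idea; deliberately not tied to any real chat API, and assuming messages arrive as simple dicts. A human with good social judgment still reviews everything it flags:

```python
import re

# Generic watchword alerting for an internal channel (Slack, subreddit, ...).
# Not tied to any real chat API; assume `messages` is whatever your
# platform's export or webhook hands you. Watchwords are illustrative.

WATCHWORDS = {"phishing", "password", "vpn", "2fa", "scam"}

def flag_messages(messages: list[dict]) -> list[dict]:
    """Return messages mentioning a watchword, for a *human* to review."""
    flagged = []
    for msg in messages:
        words = set(re.findall(r"[a-z0-9]+", msg["text"].lower()))
        if words & WATCHWORDS:
            flagged.append(msg)
    return flagged

sample = [{"user": "student1", "text": "Is this VPN reset email a scam?"}]
print(flag_messages(sample))  # flagged: mentions "vpn" and "scam"
```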
Cost in time, productivity, and trust ✦ UW-Madison employs over 20,000 people.
✦ Imagine that each person spends just one minute considering that fake phish. That’s over 20,000 minutes of work time, or 333 hours.
✦ I don’t have an average salary to hand (median would be better anyway), but let’s assume it’s something like $40/hour. That’s $13,320 plus the cost to build and field the test and analyze the results, just for one dang phishing test. (The arithmetic is sketched in code after this list.)
✦ Notice that I’m not even counting (re)training time here. It adds up.
✦ Nobody likes phish tests. (I actively resent UW System for doing them.) Does your org need more reasons for employees to be mad at it?
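✦ The back-of-the-envelope arithmetic above, in code; the $40/hour rate is the assumption stated above, not real salary data:

```python
# Reproduces the cost estimate above. hourly_rate is an assumed figure.

employees = 20_000
minutes_each = 1   # one minute per person considering the fake phish
hourly_rate = 40   # assumed average; a median would be better

hours = employees * minutes_each // 60  # 333 hours (rounded down)
wage_cost = hours * hourly_rate         # 333 * 40 = $13,320

print(f"{hours} hours of work time, ${wage_cost:,} in wages")
# ...before the cost to build, field, and analyze the test itself.
```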
Is it worth it? ✦ No training or testing has been shown to eliminate successful phishes altogether.
✦ Research: counterintuitively, repeated phishing tests plus non-mandatory training can make people MORE likely to click on real phish.
✦ Lain, Kostiainen & Čapkun 2021
✦ Why? Not totally clear, but it appears the affected folks think the training means the org is protecting them, so they don’t actually have to worry about phish as individuals.
✦ In other words, folks completely misunderstood the point of the test! That… doesn’t bode well for their infosec hygiene generally, and it also doesn’t say much for any training they’ve gotten.
Cruel pretexts ✦ Money, sometimes life-changing money, aimed at folks who are poor and/or poorly paid
✦ Real-life examples: COVID benefits, bonuses, health-care promises, discounts — that didn’t exist.
✦ I saw an infosec pro on Twitter point out that the root cause here is actually lousy pay and benefits. That’s absolutely correct; treating people poorly and underpaying them creates a lot of infosec risk.
✦ Higher ed: grade- or disciplinary-action-related scare tactics
✦ Phishes ostensibly from the [Big] Boss, often designed to make the person think they might be (or get) in trouble
✦ I’m waiting for somebody with anxiety to sue under the ADA over this. I am not a lawyer, but I have to think it’d be a workable case.
Disciplining people over phishing-test results ✦ It’s happened. It shouldn’t.
✦ It’s an absolutely sordid and unethical idea.
✦ One reason: phish people often enough, and everyone will eventually click. Yes, pretty much everyone. (Lain, Kostiainen & Čapkun 2021) Different people fall for different pretexts, but everybody’s vulnerable to something.
✦ Another reason: disincentivizing reporting, again — if people are afraid to report infosec issues because they fear punishment, incidents are more likely, higher-severity, and longer-lasting.
✦ A third: disciplining people you intentionally deceived is just gross. It’s a garbage-humanny, unethical way to treat people.
✦ Keep in mind also that people perceive “you failed; go take another training” as punishment.
Better ideas ✦ (mostly from the Lain et al. piece)
✦ Make it easy to report suspected phish. Then take people’s reports seriously!
✦ It turns out that if reporting is easy enough, people will get in the habit of doing it (habituation for the win, for once!), and won’t get sick of it.
✦ Averaged over the organization, these reports are indeed good advance indicators of phishing campaigns. (I think it’s possible to improve on this by evaluating who’s a good phish reporter, then paying extra attention to their reports; a sketch follows this list. Future research!)
✦ Positive reinforcement: reward phish-finders!
✦ Including those who initially fall for a phish, but report it quickly anyway.
✦ Tell them when they detected real phish, so their phish-radar improves.
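✦ A hypothetical sketch of the “weight good reporters” idea floated above: score each reporter by their past hit rate, then triage new reports accordingly. Field names and data shapes are made up for illustration:

```python
from collections import defaultdict

# Hypothetical reporter-reliability triage (the "future research" idea).
# Report records and field names are illustrative, not a real system.

def reporter_accuracy(history: list[dict]) -> dict[str, float]:
    """Fraction of each person's past reports that were real phish."""
    hits, totals = defaultdict(int), defaultdict(int)
    for report in history:
        totals[report["reporter"]] += 1
        hits[report["reporter"]] += report["was_real_phish"]
    return {who: hits[who] / totals[who] for who in totals}

def triage(new_reports: list[dict], accuracy: dict[str, float]) -> list[dict]:
    """Look at reports from historically reliable reporters first."""
    return sorted(new_reports,
                  key=lambda r: accuracy.get(r["reporter"], 0.0),
                  reverse=True)

history = [{"reporter": "alice", "was_real_phish": True},
           {"reporter": "bob", "was_real_phish": False}]
print(triage([{"reporter": "bob"}, {"reporter": "alice"}],
             reporter_accuracy(history)))  # alice's report comes first
```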