Reverse Engineering the Wetware - Understanding Human Behavior to Improve Information Security

The human mind evolved to draw quick conclusions for survival. Behavioral economists, like Daniel Kahneman and Dan Ariely, are publishing research on when, why and how decision making can be consistently and predictably irrational. You could say these researchers are reverse engineering the wetware, finding bugs and race conditions and disclosing them.

People are key to an organization’s information security, even if you believe in the “people, processes and technology” tripod. People define and execute processes. People decide funding for, implement, operate and/or monitor the technology. Your adversaries are people. At least until we reach the AI singularity, that is.

Until then, the aim of this talk is to present some of the counter-intuitive findings of behavioral economics research and their implications for how information security is handled at the organizational and market levels. Our hope is that the audience will find they could benefit from changing established, seemingly sensible and logical actions we all do to better match how the wetware actually works.

Presented at BSides San Francisco 2016.

Alexandre Sieira

February 28, 2016

Transcript

  1. Reverse Engineering the Wetware: Understanding Human Behavior to Improve Information Security (#retww)
     Alexandre Sieira (@AlexandreSieira), CTO @NiddelCorp, Principal @MLSecProject
     Matt Hathaway (@TheWay99), Products Leader @Rapid7
  2. Let’s Start On A Very Serious Note. Because we are very serious people. Seriously.
  3. The People-Process-Technology Triad
     • People define and execute processes.
     • People decide funding, implement, operate and/or monitor the technology.
     • Your adversaries are people.
     • People are key.
  4. Yes, people. Really.
     • A lot of humans are needed to keep any organization secure:
        • Executive teams – funding (or lack thereof)
        • Security teams – build the process, use the technology
        • IT teams – implement controls, operate infrastructure
        • Security vendors – ummm… perfect people, of course, right?
        • The end users – your organization wouldn’t exist without them
     • Blame game achieves nothing:
        • Executives don’t get it.
        • It’s the shitty tech, not us.
        • It’s the shitty team, not the tech.
        • I took that dumb training, but I want my bonus.
  5. Why do Humans Think Like They Do?
     • Wetware evolved to meet the survival requirements of a hunter-gatherer lifestyle and threat model:
        • Low-latency, near-real-time decisions;
        • High aversion to loss;
        • False positives have much lower cost than false negatives;
        • Social living in small communities where it is easy to keep track of reputations.
     ... so it is not really surprising that those heuristics don’t work as well now.
  6. Why do Humans Think Like They Do?
     From “The Art of Thinking Security Clearly”, RSAC USA 2015 presentation by @apbarros:
     • System 1 operates automatically and quickly, with little or no effort, no sense of voluntary control and no self-awareness.
       “System 1 runs the show, that’s the one you want to move.” – Daniel Kahneman
     • System 2 allocates attention to the effortful mental activities that demand it, including complex computations. Its operations are often associated with the subjective experience of agency, choice, and concentration.
  7. You Are Not Above Average At Everything
     • Ever heard of this town? Yeah, me neither.
     • “…all the women are strong, all the men are good-looking, and all the children are above average.” – Garrison Keillor
     • I’m a better-than-average X.
     • I am better than the sysadmin who got pwned at RSA.
  8. Experts Who Are Actually Amateurs
     • Successful lawyers of [major law firm]: “I spend all day avoiding traps. No phishing email is going to work on me.”
     • Some silly finance team wired $1 million to help the CFO close a last-minute deal in China. How could they fall for that?
  9. Experts Who Feel Like Amateurs
     • Can’t get through all your email?
     • In awe of a colleague’s knowledge?
     • Hungover and can’t focus?
     • Feel underqualified and at risk of being exposed at any moment?
     • Recommended reading: the “Imposter Syndrome in DFIR” blog post by @sroberts
  10. “Expert” Reality
     • If you don’t think you can be outwitted, you probably already are.
     • If you don’t ever feel like an imposter, you’re not challenging yourself to get better.
  11. Professional Certifications
     • Can be objectively useful as signals and filters in the market for professionals and organizations (information asymmetry);
     • Getting a certificate causes you to overestimate your own expertise (H/T to @fsmontenegro);
     • Positive bias towards the vendor and/or other certificate holders:
        • “If I went through the effort of certifying on that vendor’s product and I consider myself a good person, then that vendor must be good too”;
        • Endowment effect;
     • Might impact vendor selection and/or hiring processes.
  12. More Tired == Less Rational
     • Tired people are less likely to make rational decisions:
        • Less oversight from System 2;
        • Less capacity to avoid and resist temptation.
     • Think IT maintenance windows and DFIR:
        • Mandatory down time for people involved?
        • Follow-the-sun teams working on their own mornings?
     • References:
        • https://hbr.org/2016/02/dont-make-important-decisions-late-in-the-day
        • http://sloanreview.mit.edu/article/why-sleep-is-a-strategic-resource/
  13. You Will Be Judged Unfairly
     • Think the team was at fault for the Target/Neiman Marcus breaches?
     • February 2014: “Hackers Set Off 60,000 Alerts While Bagging Credit Card Data”
     • December 2014: lawsuit not dismissed because of “disabling certain security features”
  14. Your Heart Is Not Scientific
     • Always had a “gut feeling” about something you couldn’t prove?
     • Know in your heart that every time you got sick, you waited too long?
     • Maybe you have a specific event which always seems to reveal the truth?
  15. We Search For Explanations In What We Have
     • Ever heard an educated person explain that their team only lost because they didn’t stick to their ritual?
     • What if simply checking the hash against VirusTotal was enough to find malware the first time, so you kept doing it? (See the sketch after this slide.)
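As an aside on that last point, here is a minimal sketch of the ritual in question: looking a file hash up on VirusTotal. It assumes the v3 file-lookup endpoint and an API key supplied via a VT_API_KEY environment variable, and the response fields used are the commonly documented ones, so treat it as an illustration rather than a drop-in tool.

    # Minimal sketch: look up a file hash on VirusTotal (v3 API assumed).
    # Assumes an API key in the VT_API_KEY environment variable.
    import os
    import sys

    import requests


    def vt_lookup(sha256):
        """Return the VirusTotal report for a hash, or None if it is unknown."""
        resp = requests.get(
            "https://www.virustotal.com/api/v3/files/" + sha256,
            headers={"x-apikey": os.environ["VT_API_KEY"]},
            timeout=30,
        )
        if resp.status_code == 404:
            return None  # never seen by VirusTotal: absence of evidence, not a clean bill
        resp.raise_for_status()
        return resp.json()


    if __name__ == "__main__":
        report = vt_lookup(sys.argv[1])
        if report is None:
            print("Unknown hash - that alone does not mean the file is benign.")
        else:
            stats = report["data"]["attributes"]["last_analysis_stats"]
            print("Malicious verdicts:", stats.get("malicious", 0))

The slide's point stands either way: a lookup that worked once is not evidence that it is sufficient every time.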
  16. Test. Trick. Get Attention. Trigger System 2.
     • Still not considering phishing your employees? It might be the only way they’ll ever think twice.
     • Incident response exercises seem cheesy? Consider using random data or fake incidents to go further.
  17. Why and When do Humans Cheat?
     • Rational humans would cheat when a cost-benefit analysis merited it:
        • Personal gain from cheating;
        • Chance of being caught;
        • Penalty if caught.
     • Most actual humans cheat when:
        • There’s a gain for self or others;
        • It’s possible to justify or rationalize.
     • “Fudge factor” model (contrasted with the rational model in the sketch after this slide).
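To make the contrast concrete, here is a toy sketch of the two models on the slide. The function names, numbers and the "fudge limit" are illustrative assumptions, not anything quantified in the talk.

    # Toy contrast (illustrative only): the "rational crime" cost-benefit model
    # versus the "fudge factor" observation described on the slide.

    def rational_actor_cheats(gain, p_caught, penalty):
        """Classic cost-benefit model: cheat iff expected gain exceeds expected penalty."""
        return gain > p_caught * penalty


    def fudge_factor_cheat_amount(gain, can_rationalize, fudge_limit=10.0):
        """Observed behavior: most people cheat only a little, and only when they
        can still justify it to themselves, largely regardless of detection odds.
        The fudge_limit value is an arbitrary placeholder."""
        if not can_rationalize:
            return 0.0
        return min(gain, fudge_limit)  # cheat a bit, but not enough to feel dishonest


    # The rational model predicts no cheating when the expected penalty dominates...
    print(rational_actor_cheats(gain=1000, p_caught=0.10, penalty=50000))   # False
    # ...while the fudge-factor model still predicts small-scale cheating.
    print(fudge_factor_cheat_amount(gain=1000, can_rationalize=True))       # 10.0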
  18. Dan Ariely - Dishonesty
     • “Distance” makes cheating easier to justify:
        • Moving the golf ball: hand << foot << club;
        • Same monetary values, different behaviors: stolen Skype account and charges against a PayPal account;
        • “Would this person have taken cash from my wallet?”
  19. Cybercrime Feels Victimless
     • Just as spending tokens removes a lot of the pain of spending real money...
     • ...merely writing software or sending emails keeps criminals from feeling the guilt.
     • The vast majority would never rob a stranger at knifepoint.
  20. Conflicts of Interest and Self-Deception
     • Dentists who purchase CAD/CAM machines prescribe more unnecessary treatments that use them;
     • Art-appreciation-under-fMRI experiment: positive bias towards vendors.
     • “Perfectly well-meaning people can get tripped up on the quirks of the human mind, make egregious mistakes, and still consider themselves to be good and moral.”
  21. WTH (“What the Hell”) Effect
     • The more we perceive ourselves as immoral, the more immoral behavior we allow ourselves;
     • “Normalization of deviance” effect over time;
     • Pay attention to feedback from outsiders; make data-driven decisions when possible;
     • “Resetting” events / rituals might help overcome this: confession, New Year’s resolutions, the Truth and Reconciliation Commission in South Africa.
  22. Payback and Altruism
     • Coffee shop experiment shows revenge can be an unconscious justification for dishonesty:
        • 45% returned extra money if the experimenter was polite;
        • Only 14% if impolite.
     • Social utility is also a factor:
        • “Robin Hood” effect;
        • With group quotas and bonuses, not cheating has an impact on peers or underlings.
  23. The Value of Good Examples
     “The (Honest) Truth About Dishonesty: How We Lie to Everyone---Especially Ourselves”, Dan Ariely
  24. Priming the Morality Cache
     • If you remind people about moral codes or their responsibility to act correctly, cheating is reduced:
        • Honor code;
        • Religious commandments;
        • Personal commitment.
     • Simply moving the signature to the top of an insurance form instead of the bottom increased self-reported car mileage by 15%;
     • Implications for process and UI design (see the sketch after this slide).
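One way to apply that last point to tooling is to ask for the attestation before the self-reported data rather than after it. The sketch below is a hypothetical command-line flow; the prompts and field names are invented for illustration and are not from the talk.

    # Hypothetical sketch: prime honesty by collecting the attestation *before*
    # the self-reported data, mirroring the signature-at-the-top finding.

    def collect_self_report():
        # Attestation first (the "signature at the top").
        pledge = input("I confirm the information I am about to enter is accurate (yes/no): ")
        if pledge.strip().lower() != "yes":
            raise SystemExit("Attestation declined; report not collected.")

        # Only then ask for the self-reported values.
        mileage = int(input("Annual mileage: "))
        incidents = int(input("Incidents in the last 12 months: "))
        return {"mileage": mileage, "incidents": incidents}


    if __name__ == "__main__":
        print(collect_self_report())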
  25. Designing Rules and Incentives
     “When the rules are somewhat open to interpretation, when there are gray areas, and when people are left to score their own performance – even honorable games may be traps for dishonesty.”
     “The (Honest) Truth About Dishonesty: How We Lie to Everyone---Especially Ourselves” – Dan Ariely
     http://www.zdnet.com/article/facebook-engages-in-instagram-bug-spat-with-security-researcher/
  26. But At Least We Can Trust Vendors, Right?
     You must be nodding right now. Why else would this section be here?
  27. Don’t Trust Statistics… Without Specifics
     • “…the malware that was used would have slipped or probably got past 90 percent of internet defenses that are out there today in private industry and [would have] challenged even state government.” -- Joseph Demarest, assistant director of the FBI’s cyber division
     • “FBI: 90% Of US Companies Could Be Hacked Just Like Sony” -- Business Insider
  28. Why Does FUD Marketing Still Exist?
     • Ever hear “such a dangerous time we live in”?
     • Did you FLY HERE?!
     • What if you told your CFO a solution would prevent stolen HVAC accounts from authenticating?
  29. InfoSec marketing rings a bell?
     “The (Honest) Truth About Dishonesty: How We Lie to Everyone---Especially Ourselves”, Chapter 7
  30. The Experts Again. From Adjacent Fields.
     • Ever heard of “green lumber” trading? An expert on green lumber would make a great trader, right?
     • NSA retiree == InfoSec expert?
     • Intelligence official [asked about Snowden’s skills]: “It’s 2013 and the NSA is stuck in 2003 technology.”
     • Buy our detection because… HD!
  31. Always Consider What’s Unsaid
     • Would you buy something that “eliminated 90% of herpes”? What if it said “10% of herpes survives”?
     • So now “blocks 90% of malware”? (See the arithmetic after this slide.)
     • “Pricing starts at $99”?
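The framing hides the base rate. A back-of-the-envelope sketch, with a made-up sample volume purely for illustration:

    # Illustrative arithmetic only: what "blocks 90% of malware" leaves unsaid.
    # The monthly sample volume is an assumption, not a vendor statistic.
    samples_per_month = 10_000      # hypothetical malicious samples hitting the perimeter
    block_rate = 0.90               # the number the marketing leads with

    missed = samples_per_month * (1 - block_rate)
    print(f"'Blocks 90%' also means roughly {missed:.0f} samples per month get through.")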
  32. Find Leadership Who Lets You Admit #FAIL
     • Ever feel like it would be a waste to not finish your meal at a restaurant?
     • Think you’ve already committed to this talk, so you should stay?
     • How about “no more money for detection until the SIEM ROI is proven”?
  33. Should You Always Push For ‘Build’ Over ‘Buy’?
     • Never give a product consideration because of a shitty demo?
     • IT and security software UIs have been terrible for a decade, so they won’t ever change, right?
     • Think your security team can build better software every time?
  34. Human Minds Find Patterns In Randomness
     • Confident you can differentiate between random and malicious?
     • Remember the iPod shuffle-mode complaints when a song was repeated? (A quick simulation follows after this slide.)
     • Manually reviewing logs all day would probably be best for finding patterns, though…
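A quick simulation of the shuffle complaint, as an illustration: it assumes the player picks each next song uniformly at random (with replacement), and the playlist size, session length and trial count are arbitrary. Even genuinely random selection repeats songs surprisingly often, and our pattern-seeking minds read that as "not random".

    # Illustration: genuinely random selection still produces "suspicious" repeats.
    import random

    PLAYLIST_SIZE = 200       # distinct songs in the library
    SESSION_PLAYS = 20        # songs played in one sitting
    TRIALS = 100_000          # simulated listening sessions

    sessions_with_repeat = 0
    for _ in range(TRIALS):
        plays = [random.randrange(PLAYLIST_SIZE) for _ in range(SESSION_PLAYS)]
        if len(set(plays)) < SESSION_PLAYS:   # at least one song came up twice
            sessions_with_repeat += 1

    print(f"Sessions with a repeated song: {100 * sessions_with_repeat / TRIALS:.1f}%")
    # Birthday-paradox math puts this at roughly 62% for these numbers,
    # even though listeners tend to read any repeat as evidence of bad shuffling.

The same instinct is what makes eyeballing the logs feel more trustworthy than it is.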
  35. Look To Take The Conclusion From Humans
     • Battling cognitive biases is hard.
     • We will make the wrong decision often and consistently (as covered here).
     • Be aware of this and strive to call yourself out.
     • Humans are best at deciding what to do:
        • Identifying symptoms – machines;
        • Identifying root cause – machines;
        • Identifying solutions – humans.
  36. Recommended Reading
     • The (Honest) Truth About Dishonesty: How We Lie to Everyone---Especially Ourselves, and everything else by @danariely
     • The Black Swan: The Impact of the Highly Improbable, and everything else by Nassim Nicholas Taleb
     • Thinking, Fast and Slow by the legendary Daniel Kahneman
     • Economic aspects of tech certifications by @fsmontenegro:
        • https://fsmontenegro.wordpress.com/2016/01/03/professional-certifications-information-asymmetry/
        • https://fsmontenegro.wordpress.com/2016/01/13/professional-certifications-behavioural-economics/
     • RSAC USA 2015 – The Art of Thinking Security Clearly by @apbarros:
        https://www.rsaconference.com/events/us15/agenda/sessions/1531/the-art-of-thinking-security-clearly