
A culture of safety

Elliot Murphy
September 27, 2016


Discussions of security, compliance, and safety all too often start and end with technical controls. In this talk I'll share experiences helping multiple companies meet the demands of regulators and customers for safeguarding medical data without driving engineers to frustration. No buzzwords or FUD, just real human experience of overcoming the challenges of applying infrastructure as code in traditionally conservative domains.


Transcript

  1. Minimizing misery: a culture of safety in regulated environments
     Elliot Murphy @sstatik kindlyops.com
     I've worked for a couple of decades now on mission-critical systems: telephone systems for emergency centers, backup software, databases, operating systems, and a healthcare web app for mental health treatment. As a consultant to regulated companies I have increasingly been exposed to policy, to the hard work involved in making rules for others, and to trying to figure out: are we doing a good job? Minimizing misery really is about the people involved. Safety comes from people! If people are irritated they won't be doing a good job. I want to tell you why you should care, and then share three big ideas.


  2. Regulations are increasing
     October 2015: the Court of Justice of the European Union invalidates Safe Harbor
     April 2016: the European Parliament adopts the General Data Protection Regulation
     2016: the US Office for Civil Rights begins its Phase 2 HIPAA audit program
     Elliot Murphy @sstatik kindlyops.com
     Increased regulation means increased scrutiny while we are just trying to work, and that can lead to deep frustration. Sometimes regulations are applied in stupid ways, and sometimes regulations are misunderstood, but they aren't all bad. Some engineers see the extra effort and cost as waste, and that is a hypocritical and unhelpful perspective: tell me how rules for protecting customers are wasteful, and then tell me how you aren't using load balancers and replicated data for high availability. Instead, let's talk about some ways to reduce the frustrations.


  3. 1. Connect controls to requirements
     Empower engineers to understand how to completely change the controls while still meeting the original requirement.
     Elliot Murphy @sstatik kindlyops.com
     Sometimes you have regulations or laws that require something specific, but they are usually a bit vague about how to accomplish it. An example that is easy to understand: you must use encryption in transit and at rest. There are many different ways to accomplish that: GPG, TLS, encryption before, in, or under the database.

     Then you get requirements that change over time: they come from a customer's well-intentioned security policy, or from changing guidance about how the regulations are interpreted. It is crucial that you maintain a map of where each requirement came from. STORY: one state did not allow vendors to hire people with a misdemeanor criminal record. We maintained a tight connection between that control (the hiring policy) and that customer. Our options were to not hire someone, to hire them and define their job so that it didn't require access to that customer, or to fire the customer. Any of those options would leave us compliant. The bad thing to do would have been to enshrine this rule in company policy and forget where it came from. Rules need half-lives and context. See the sketch below.
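
     To make "maintain a map of where each requirement came from" concrete, here is a minimal Python sketch (mine, not from the talk; the control names, sources, and review dates are illustrative) that ties each control to the requirement it satisfies, where that requirement came from, and a review date so the rule has a half-life:

         # Illustrative sketch: a control registry that keeps each control tied to
         # the requirement it satisfies, where that requirement came from, and a
         # date by which it should be re-reviewed.
         from dataclasses import dataclass
         from datetime import date

         @dataclass
         class Control:
             name: str          # what we actually do
             requirement: str   # the underlying requirement being met
             source: str        # law, customer contract, or internal policy
             review_by: date    # the "half-life": when to re-check it still applies

         controls = [
             Control(
                 name="TLS on all service endpoints; disks encrypted at rest",
                 requirement="Encrypt sensitive data in transit and at rest",
                 source="Regulatory guidance (illustrative)",
                 review_by=date(2017, 9, 1),
             ),
             Control(
                 name="Background check excludes misdemeanor records",
                 requirement="Vendor staff must pass state screening rules",
                 source="Customer contract / state vendor rule (illustrative)",
                 review_by=date(2017, 3, 1),
             ),
         ]

         def due_for_review(today: date):
             """Controls whose origin should be re-confirmed before we keep enforcing them."""
             return [c for c in controls if c.review_by <= today]

         for c in due_for_review(date.today()):
             print(f"Re-review: {c.name} (came from {c.source})")

     Because the origin is recorded next to the control, the control can be retired or rewritten the moment the customer leaves or the rule changes, instead of lingering as unexplained policy.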


  4. 2. Solve for intent
     Some policies are out of date; don't use them as an excuse to do a bad job. Remember the humans who have to use the system.
     Elliot Murphy @sstatik kindlyops.com
     Don't just meet the letter of the law; remember there are real people and real human factors involved in these systems. Story of Beth: she has difficulty with muscle control, is in a power chair, is deaf, and uses a video phone. Trying to sign up for Glide, a video instant messenger, required SMS confirmation. Beth doesn't have a cell phone. No problem, she can take a voice call instead. She uses a video interpreter, and the automated system spits out the four digits before the interpreter can get Beth on the video phone.


  5. 3. Always be auditing
     Search for "racist algorithms"
     Accountable Algorithms: http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2765268
     Elliot Murphy @sstatik kindlyops.com
     Please, please build auditing into your systems. It is a two-fold protection: it protects customers, and it protects operators from being wrongly accused. Auditing is particularly useful when trying to do DevOps: grant broad permissions and then review how they are used. We're rushing headlong into a pretty serious problem, which is that auditing is becoming close to impossible due to the volume of data and the fact that more and more is being done by algorithms. There have been multiple incidents in 2016 where we were shocked at the results coming from algorithms, and it's crucial that we figure out how to audit our algorithms and data. See the sketch below.
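
     As one way to picture "grant broad permissions and then review how they are used", here is a minimal Python sketch (mine, not from the talk; the function and field names are hypothetical) that writes an append-only audit record for every privileged action so usage can be reviewed after the fact:

         # Illustrative sketch: record who did what, to which record, and when,
         # every time a privileged function is called.
         import functools
         import json
         from datetime import datetime, timezone

         def audited(action):
             """Decorator that appends a structured audit event before running the action."""
             def decorator(func):
                 @functools.wraps(func)
                 def wrapper(actor, *args, **kwargs):
                     event = {
                         "timestamp": datetime.now(timezone.utc).isoformat(),
                         "actor": actor,
                         "action": action,
                         "args": [repr(a) for a in args],
                     }
                     # Append-only file here; a real system would use tamper-evident storage.
                     with open("audit.log", "a") as log:
                         log.write(json.dumps(event) + "\n")
                     return func(actor, *args, **kwargs)
                 return wrapper
             return decorator

         @audited("view_patient_record")
         def view_patient_record(actor, patient_id):
             # Broad access is allowed, but every use is logged for later review.
             return {"patient_id": patient_id}

         view_patient_record("elliot", "patient-123")

     Regularly reviewing that log (and alerting on unusual patterns) is what turns broad access from a liability into something accountable.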


  6. Elliot Murphy @sstatik kindlyops.com
    https://speakerdeck.com/statik/a-culture-of-safety
    All images from unsplash.com
    1. Regulations are part of our life

    2. Connect controls to requirements

    3. Solve for intent

    4. Always be auditing

    5. It is possible to work in an environment where risk is present and still enjoy it.

     6. I'm starting a podcast called Safety at Speed, with a focus on the human side of things. I've been in touch with some of you to ask for interviews; if you would like to share stories, please come say hello and I'd love to interview you.
