
Overcoming Regulatory Burden in Startups - DevOpsDays Boston 2015

Elliot Murphy
September 16, 2015

A discussion of why USA healthcare regulations apply to more software than you realize, why people working in DevOps should care, and tactics for building fluency and proficiency in security and privacy.


Transcript

  1. Overcoming Regulatory
    Burden in Startups
    Elliot Murphy @sstatik [email protected]
    Why it matters in general

    Why you should personally try to level up on security/privacy/safety

    Current scene (what software is subject to regulations, who enforces them)

    Specific tactics for building fluency and proficiency

    Typically two broad sets of regulation come to mind: PCI and HIPAA.

    Not talking about PCI, because I’m pretty sure all of you work at companies that have figured out how to charge money for things. What I want to talk about today is:
    (next page)


  2. Battling risk & cost in
    healthcare software
    Who here works on healthcare software?

    I mean who works on software that is governed by regulations about safeguarding personal data connected to healthcare records (audience interaction/voting)

    I spent three years at a startup working very hard to comply with regulations. We estimated that in a 10-person SaaS company, compliance was 30% of our costs.
    That cost and complexity actually stops people from building new products that could make lives better, stifling innovation in purpose-built healthcare software. As a result, general-purpose software comes to market faster and ends up with much greater utility than purpose-built healthcare software, because it is perceived to carry less of a burden, less compliance overhead to develop. Yesterday Jennifer mentioned someone using PagerDuty to monitor glucose levels.

    As the laws have changed over the last couple of years, suddenly these regulations apply to *much* more software than before. And when legislation catches up with IoT,
    there will be even more. In 2013 when Amazon and Google suddenly started signing HIPAA BAA agreements after years of insisting that they didn’t have to - why?
    Regulations now apply to vendors and subcontractors, even if you didn’t sign an agreement with your customer.


  3. security and privacy typically
    neglected when running
    general purpose web services
    Assert that today’s typical engineers are much better at scaling than securing

    Assert that today’s typical application designs are much better at violating privacy than preserving dignity

    This is a form of local optimization, scaling hurts now, security hurts later, privacy is complicated. People have lots of work to do, there are very real costs to spending
    time on security and privacy vs features. Just like network configuration, the state of the art in security/privacy is lagging behind. The path of least resistance is to build
    and run insecurely, and with minimal privacy protections.


  4. Why work on this
    problem at all?
    Another name for security/privacy is SAFETY

    Vulnerable populations are growing: status, geography, gender, age, chronic conditions

    Those of us attending conferences are in the prime of our careers and, sweeping generalization and oversimplification, at the least vulnerable point of our lives.

    All of us will revert to vulnerability before we die, and right now we have the most power to change the status quo.

    Regulations exist to protect vulnerable populations. HIPAA enforcement by Office of Civil Rights (OCR).

    Practice, the practical application of scientific knowledge, is falling 12 years or more behind the research. My thesis is that it falls so far behind because when we declare that software is for healthcare, the regulatory burden makes it slow and expensive to build.

    Regulations are spiraling in complexity, and software is oozing into health-affecting roles faster than we are able to reason about it. Software becomes part of the care
    pathway whether it was intended to or not.

    One example: HelpScout, helpdesk software, is obviously not healthcare-related, yet it can't have customers in healthcare without being HIPAA compliant! Those requirements roll all the way down the chain. If you are going after a sizable market at all, you will need to care about HIPAA compliance.

    An example of my claim that the current state of the art in app security is not good enough for vulnerable populations: Glide: Beth’s story


  5. TAKE PICTURE I LOVE YOU. Promise from community - I will build empathy for vulnerable people, and I will apply that empathy to all software I work on.

    My friend Beth. Beth copes with two profound vulnerabilities (in addition to being poor):

    Very limited muscle control, she is in a power chair and cannot hold her head up.

    Deaf - communication is a constant challenge.

    Very bright and technically savvy - see computer in the background. Story has two morals: all software can become part of the healthcare chain, and we suck at
    operating secure web services.

    An example of the challenges caused by limited communication: staff lifting her into her bed recently tore the tendons in her shoulder so badly she could not sign with her dominant hand, and needed surgery. Because of limited communication, she was unable to report the incident. Staff denied she needed medical attention.

    In this life context, Beth was very excited to use glide.me. I'm not picking on Glide, just using a real-life example; they are a wonderful group of people doing their best to make an app that has been a surprise hit in the deaf community. Glide is video-based messaging: cross-platform, closer to text messaging or chat than to a video call. Remember that Beth is poor, so she has a tablet with wifi access but no mobile phone. To combat fraud, Glide requires SMS verification to activate. Contacting support is no problem; they can call a land line. Beth does have access to a video phone, since the government pays for a voice-to-video relay service. But the land line confirmation consists of a computerized voice spitting out four numbers as soon as the interpreter answers the phone, before the interpreter can get Beth onto the video phone. Eventually, by having some friends help defeat the system, Beth gets access, and can communicate with her friends and advocates. Beth can then seek assistance in getting care, which includes discussing risky details about who injured her. Risky means she has been threatened with being kicked out of the home for asking for equipment that can transport her without injury.


  6. Why us?
    Why should we, the people at DevOpsDays, care about this?

    Most of us don’t identify as working on “healthcare”.

    Plato's Allegory of the Cave

    People living inside the cave see shadows on the walls, cast by unseen actors outside the cave, and think the shadows are reality; they don't know the people outside the cave exist, and don't know any of the details. In the DevOps community, we see shadows: stories about security and privacy being complicated, and rules being burdensome, and we accept that indirect knowledge as reality. When someone is painfully dragged out of the cave and exposed to the details outside, they understand more of the world and what is causing the shadows. When they return to the cave with a different version of reality, the others think they are damaged and refuse to learn. In this case, the reality outside is that the rules do apply to you, even though enforcement is currently poor. The utility of general-purpose software is greatly outpacing the utility of special-purpose software, and we should not try to stop it.


  7. DevSecNetCustomerOps
    One pop-culture film that draws heavily on Plato's allegory of the cave is The Matrix.

    This is (I think) the most interesting character in the story, an archetype I find immensely inspiring. The Keymaker can move between arbitrary worlds and contexts; he can enter and leave many caves without being dragged. He does not have the most power, but you could argue he is in the ideal situation for developing empathy.

    Of course empathy and understanding multiple worlds is key to DevSecNetCustomerOps. We are the ones in the throes of building and running the general-purpose software which is oozing into every facet of our lives, and we are the ones making decisions that will ultimately affect us personally as we transition back into "vulnerable populations". We are the ones most able to understand all the tradeoffs involved in securely operating web services for the public, and in fact we have the ability to improve security and privacy faster than it can be legislated. I think this might be called self-regulating.

    So that's why, and why us. Let's go through some doors.


  8. HIPAA & HITECH
    Who enforces the rules?
    OCR: Office for Civil Rights
    FDA: Food & Drug Administration
    FTC: Federal Trade Commission
    SAG: State Attorneys General
    Private lawsuits
    IoT regs haven't hit yet
    You would think that an even simpler question than "What are the rules?" would be "Who enforces the rules?". It turns out both are complicated.

    OCR is responsible for enforcing the security and privacy rules of HIPAA. Enforcement happens when a complaint is received.

    State Attorneys General have overlapping jurisdiction, and started enforcement actions in 2010.

    The FDA is also responsible for medical devices, and has been grappling with mHealth: when does a smartphone become a medical device?

    Federal Trade Commission is responsible for consumer protection, and could get involved with large scale breaches of consumer privacy or mishandling of consumer
    data.

    Enforcement actions don’t necessarily provide restitution to the injured party, so private lawsuits come into play.


  9. Why us?
    Understandably, when first exposed to this reality, we might want to head back into the cave, and, like the prisoners in the allegory, try to kill anyone who wants to talk about these things.

    In the vernacular of The Matrix: take the blue pill, stay happy and ignorant.

    Let's not do that. Let's get really good at running safe, secure, privacy-preserving services.

    When Glenn gave his opening remarks, he joked about not wanting to work with enterprises because of too much red tape. There is truth to this: a regulated environment can easily be stifling to work in. That is especially true if you start from the perspective of the rules rather than the perspective of the end goal, safeguarding a vulnerable person. I think about it like version control: if you've been in the industry for a while, you remember a time when a lot of work was done without version control, because version control was "too much overhead, too stifling". Then networked version control came along, then distributed version control, then collaboration tools like GitHub. Now NOT using version control will slow your project down. We're overdue for a similar transformation when it comes to running securely and privately. The more stories we share about how we are tackling these challenges, the faster the transformation will be from painful to highly performant.


  10. Tactics
    Don't worry about whether the rules apply to you (because they tend to start applying to you without much warning). Just do a good job.

    But what are the rules?

    What is a good job?

    What doesn't work: a checklist-based approach to compliance is not sufficient. HTTPS everywhere, encryption at rest, etc. are good things, but not enough to count as a "good job". Instead, we need to cultivate empathy and solve for the larger, real goal.


  11. #1 Empathy
    Just as we develop fluency with organizational patterns, group behaviors, systems thinking and human error, develop fluency with security and privacy, and understand
    the connection to real people.

    First: what information needs to be safeguarded? Personal info.

    Second: Who is supposed to be safeguarded? Techniques from UX design serve you well here, personas or roles. A nurse, a patient, a CEO, a sysadmin.

    With the superpower of empathy, we can develop controls that are BOTH more effective and cheaper than what the regulations steer you towards.

    Concrete application: illustrate with story about "Who's seen my stuff" feature.

    For a few years I worked on CommonGround, which is a web app that patients use together with their psychiatrist and care team. It keeps records of each visit, how
    things are going, how symptoms are, how self-care is going, etc.

    HIPAA privacy rules require covered entities to tell patients, on request, who their private information has been disclosed to. This is the sort of requirement that often gets
    muddled up in all kinds of technical excuse-making and finger pointing, analyzing which parties the law applies to, which kinds of disclosures have to be told, what kind
    of exceptions are allowed for psychiatric notes. You can wrap this all up inside Terms of Service, pay a lawyer to tell you that everything is “compliant”.

    Why does this rule exist? What's the risk, and how does it show up inside this particular system? What's the intention? Who is being protected? How can the system work without introducing lots of extra audit and log review? Who audits the auditors?

    Clinical staff with passwords often have access to a much larger group of patients than they normally work with, for emergency coverage. The same goes for doctors, prescribers, etc. Enforcement typically happens when there is a complaint. How can we detect small-scale abuse of power? When 100K records are stolen, that gets noticed; quiet, small-scale snooping usually does not.

  12. #2 Read the rules
    Given knowledge of and empathy for the people who are entrusting their data to you, the MOST liberating step you can take is to actually read the rules. This is coming out of the cave and getting a richer understanding of the world. It is amazing how many people never do this, and how many policies or ways of doing things are etched in stone because of third-hand, corrupted misinformation about the regulations. Someone was teasing me about consulting work, saying that it's basically reading-as-a-service. It's easy to get bored or lost when reading regulations, so start a reading group, and write down a document with a simple mapping of each rule to specific policies and practices in your organization.

    Do you need the data at all? Hashing passwords is understood to be best practice, but what about everything else on the User model?

    Can the person decide to disclose the data? YES

    How do we protect the customer against abuse of power by people with keys?
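    The "do you need the data at all?" question can be made concrete by reviewing a model field by field. A hypothetical sketch using only the Python standard library (a real system should use a vetted password-hashing library; the field names and iteration count are illustrative):

```python
import hashlib
import hmac
import os

class User:
    """Store only what the product actually needs.

    Dropped on review: date_of_birth, ssn, home_address
    (no feature ever used them, so we stopped collecting them).
    """

    def __init__(self, email, password):
        self.email = email
        # Never store the password itself: store a salted, stretched hash.
        self.salt = os.urandom(16)
        self.password_hash = hashlib.pbkdf2_hmac(
            "sha256", password.encode(), self.salt, 100_000)

    def check_password(self, attempt):
        candidate = hashlib.pbkdf2_hmac(
            "sha256", attempt.encode(), self.salt, 100_000)
        # Constant-time comparison avoids leaking information via timing.
        return hmac.compare_digest(candidate, self.password_hash)

u = User("beth@example.com", "correct horse battery staple")
print(u.check_password("correct horse battery staple"))  # True
print(u.check_password("wrong"))                         # False
```

    The interesting part is the docstring: the cheapest data to safeguard is the data you never stored in the first place.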


  13. #3 Tell more stories
    How can I gain experience and participate in stories when I don’t directly work on a product with specific security goals?

    Voluntarily adopt some requirements. Don’t say “hey I want to voluntarily spend money on regulations that don’t apply to us”. Instead, do a risk assessment over lunch.
    Pick the thing that is highest risk, and start chipping away at making it better. Tell stories! If you are writing tools for others, include security and privacy considerations in
    your docs and tutorials.

    Bad: configuration management example that glosses over secrets management

    Bad: code example from a database vendor or language advocate that glosses over encrypted connections

    Bad: web frameworks (and web service frameworks) that default to mutate-in-place data stores and incredibly difficult audit trails

    Bad: multi-factor authentication solutions that completely ignore the most vulnerable people in our society (poor, chronically ill, etc)

    It all boils down to sharing more stories about how to effectively secure applications and data.

    Good: Catalyze.io for publishing policies and training

    Good: Ansible having Vault, Chef having encrypted data bags

    Good: Google offering HIPAA BAA contracts at no extra charge!

    Good: AWS offering Key Management Service at a fraction of the cost of an HSM

    Good: Dustin Collins Conjur blog about Secrets and Source Control Maturity Model, pluggable tools

    Future dreams: Secure Multiparty Computation, Homomorphic encryption such as https://sharemind.cyber.ee/
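    The secrets-management complaint in the first "Bad" item above is easy to address in docs and tutorials: show readers where the secret comes from instead of pasting it inline. A minimal pattern (the variable name and error message are illustrative, not from any particular tool):

```python
import os

def require_secret(name):
    """Fail loudly at startup instead of shipping a hard-coded default."""
    value = os.environ.get(name)
    if not value:
        raise RuntimeError(
            f"{name} is not set; load it from your secret store "
            "(Ansible Vault, Chef encrypted data bags, AWS KMS, ...)")
    return value

# DB_PASSWORD = "hunter2"   # what tutorial code too often shows readers
os.environ["DEMO_DB_PASSWORD"] = "example-only"  # stand-in for a real secret store
DB_PASSWORD = require_secret("DEMO_DB_PASSWORD")
```

    A tutorial that fails fast like this teaches the habit; the value itself can then come from Vault, encrypted data bags, or KMS without any change to the application code.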


  14. Questions?
    Want help?
    Get in touch:
    [email protected]
    @sstatik
    https://speakerdeck.com/statik
