Slide 1

Security Metrics Rehab: Breaking Free from Top ‘X’ Lists, Cultivating Organic Metrics, & Realizing Operational Risk Management
April 11, 2014

Slide 2

About Me
- Author of the P.A.S.T.A. threat modeling methodology (risk centric) (Wiley Life Sciences)
- Nearly 20 years of IT/security experience (developer, security architect, Director of RM, CISO roles)
- Founder of VerSprite, a global security consulting firm based in Atlanta
- OWASP ATL chapter/project leader for the past 6 years
- VerSprite works across retail, hospitality, financial, gov’t, higher ed, healthcare, and info services
- Co-organizer of the BSides Atlanta grassroots security conference
- B.S. from Cornell in International Finance
- Contact info: tonyuv@versprite.com, @t0nyuv / @VerSprite

Slide 3

About This Talk
- A counterculture presentation on metrics, governance, and risk
- Depict the pros and cons of existing metrics and frameworks in the public domain
- Introduce a seed of thought around building organic security metrics

Slide 4

Consider the Following
If you had to pick only one…
- Option A: A fully developed security controls framework with supporting metrics based upon leading industry lists and control frameworks
- Option B: A fully developed risk framework where inherent and residual values are quantifiable, supported, and tied to business impact scenarios

Slide 5

METRICS VS “BSETRICS”

Slide 6

Separating Fact from Fiction

Metrics
- Objective focused
- Built from “What do I need?” (e.g., the goal of providing evidence of effective technology/process management)
- Data source is dependable and vast
- Metrics should have a reliable data source that augments over time
- Outliers are factored out
- Support clearly defined IT/business goals

“BSetrics”
- Metrics that ‘feel/look good’ (e.g., closed risk issues)
- Built from “What do I have?” (e.g., the tool begins to shape the metrics discussion)
- Based upon “industry standard”
- “Keeping up with the Joneses” metrics
- Building metrics to manage perception
- Data set is limited (e.g., time, breadth, pre-fixed)
- Outliers are not factored out

Slide 7

Bad Metrics
- Issues remediated
  - Unanswered: tested or not tested?
  - Unanswered: was the issue resolved, closed, or transferred?
  - Unanswered: is this ‘issue’ important?
- # of high vulnerabilities closed
  - Deemed ‘High’ by whom?
  - No context (a high risk to a low-value asset)
- # of code imperfections against a Top X list (SAST scan)
  - The Top X list begins to drive your risk perception devoid of anything else
  - Cultivating responses, remediation, and reports solely on Top X items
- # of pen tests | # of web app scans conducted
  - Doesn’t factor in automated or poorly conducted testing

Slide 8

STATUS QUO SECURITY METRICS

Slide 9

Examples of AppSec Metrics Today

Process Metrics
- Is an SDL process used? Are security gates enforced?
- Secure application development standards and testing criteria?
- Security status of a new application at delivery (e.g., % compliance with organizational security standards and application system requirements)
- Existence of a developer support website (FAQs, code fixes, lessons learned, etc.)?
- % of developers trained, using organizational security best-practice technology, architecture, and processes

Management Metrics
- % of applications rated “business-critical” that have been tested
- % of applications which business partners, clients, or regulators require be “certified”
- Average time to correct vulnerabilities (trending) (see the sketch below)
- % of flaws by lifecycle phase
- % of applications using centralized security services
- Business impact of critical security incidents
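As a minimal sketch of how the “average time to correct vulnerabilities” figure above could be computed, the snippet below assumes a hypothetical list of finding records; the field names (app, severity, opened, closed) are illustrative only and not taken from the talk.

from datetime import date
from statistics import mean

# Hypothetical finding records; in practice these would come from a tracker export.
findings = [
    {"app": "billing", "severity": "critical", "opened": date(2014, 1, 6), "closed": date(2014, 1, 20)},
    {"app": "billing", "severity": "high",     "opened": date(2014, 1, 9), "closed": date(2014, 2, 3)},
    {"app": "portal",  "severity": "critical", "opened": date(2014, 2, 2), "closed": None},
]

def mean_time_to_correct(records, severity):
    """Average days from open to close for resolved findings of a given severity."""
    days = [(r["closed"] - r["opened"]).days
            for r in records
            if r["severity"] == severity and r["closed"] is not None]
    return mean(days) if days else None

print("MTTC (critical):", mean_time_to_correct(findings, "critical"), "days")

Running the same calculation per month or per quarter gives the trending view the slide refers to.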

Slide 10

AppSec Metrics in Vuln Management
- Number and criticality of vulnerabilities found
- Most commonly found vulnerabilities
- Reported defect rates based on security testing (per developer/team, per application)
- Root cause of “vulnerability recidivism”
- % of code that is re-used from other products/projects*
- % of code that is third party (e.g., libraries)*
- Results of source code analysis** (a small rollup sketch follows this list):
  - Vulnerability severity by project, by organization
  - Vulnerabilities by category by project, by organization
  - Vulnerability +/- over time by project
  - % of flaws by lifecycle phase (based on when testing occurs)

Source: * WebMethods, ** Fortify Software
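A small rollup sketch for the source-code-analysis metrics above (severity by project, category by project, and vulnerability +/- over time), assuming a hypothetical flat export of SAST findings; the project names, categories, and field layout are placeholders.

from collections import Counter

# Hypothetical SAST export rows: (project, scan_month, category, severity)
scan_rows = [
    ("portal",  "2014-01", "SQL Injection",  "High"),
    ("portal",  "2014-01", "XSS",            "Medium"),
    ("portal",  "2014-02", "XSS",            "Medium"),
    ("billing", "2014-02", "Path Traversal", "High"),
]

def counts_by(rows, *keys):
    """Tally findings by any combination of fields (project, category, severity, scan_month)."""
    index = {"project": 0, "scan_month": 1, "category": 2, "severity": 3}
    return Counter(tuple(row[index[k]] for k in keys) for row in rows)

print(counts_by(scan_rows, "project", "severity"))   # severity by project
print(counts_by(scan_rows, "project", "category"))   # category by project
trend = counts_by(scan_rows, "project", "scan_month")
print(trend[("portal", "2014-02")] - trend[("portal", "2014-01")])  # vulnerability +/- over time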

Slide 11

ROOM FOR IMPROVEMENT

Slide 12

Forrester Survey: “What are your top three drivers for measuring information security?”
[Bar chart; drivers cited include managing risk, justifying security spending, regulations, loss of reputation, better stewardship, and reporting progress to the business, with response rates ranging from 11% to 63%.]
Base: 40 CISOs and senior security managers
Source: “Measuring Information Security Through Metrics And Reporting,” Forrester Research, Inc., May 2006

Slide 13

Good Metrics – Align w/ Maturity Model

- Align to Biz/IT Goals: metrics matter most when they have direct or indirect relevance to operational/strategic goals
- Relate to Business Processes: directly or indirectly, the categories to be measured need to map to key indicators that matter in IT Ops, Sales, and Finance
- Map to a Business Impact: a good start is to map metric areas to key processes sustained by a BIA

- Start simple
- Forget what everyone else is doing, for now
- Perform an internal PoC with LOBs/BUs
- Grow the base of coverage over time
- Mature metrics by benchmarking against industry reports/analysis

Slide 14

Opportunities for Metrics - Secure Development Life Cycle (SDL)

[Timeline figure (source: Microsoft) showing software assurance activities conducted at each lifecycle phase (Concept, Designs Complete, Test Plans Complete, Code Complete, Deploy, Post Deployment): secure questions during interviews, team member training, threat analysis, security review, data mutation & least-privilege tests, review of old defects, checked check-ins, secure coding guidelines, use of tools, an ongoing security push/audit, external review, and learn & refine.]

Slide 15

Organizing Metric Types

Process Metrics: information about the processes themselves; evidence of maturity
- Examples: secure coding standards in use; avg. time to correct critical vulnerabilities

Vulnerability Metrics: metrics about application vulnerabilities themselves
- Examples: by vulnerability type; by occurrence within a software development life cycle phase

Management Metrics: metrics specifically designed for senior management
- Examples: % of applications that are currently security “certified” and accepted by business partners; trending of critical unresolved, accepted risks

(A small cataloguing sketch follows.)
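As a small illustration of keeping the three buckets above distinct, this sketch tags each metric with its type and intended audience; the metric names, type labels, and field names are placeholders rather than content from the deck.

from collections import namedtuple

# Illustrative catalogue entry; the type labels mirror the slide's three buckets.
Metric = namedtuple("Metric", ["name", "metric_type", "audience"])

catalogue = [
    Metric("Secure coding standard in use per team", "process", "engineering leads"),
    Metric("Findings by vulnerability type per SDLC phase", "vulnerability", "AppSec team"),
    Metric("% of business-critical apps currently certified", "management", "senior management"),
]

# Group metric names by type so each audience gets only its own view.
by_type = {}
for m in catalogue:
    by_type.setdefault(m.metric_type, []).append(m.name)
print(by_type)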

Slide 16

Our Security Metric Challenge

“A major difference between a ‘well developed’ science such as physics and some of the less ‘well-developed’ sciences such as psychology or sociology is the degree to which things are measured.”
Source: Fred S. Roberts, ROBE79

“Give information risk management the quantitative rigor of financial information management.”
Source: CRA/NSF, 10-Year Agenda for Information Security Research, cited by Dr. Dan Geer

Slide 17

BREAKING FREE FROM TOP ‘X’ LISTS

Slide 18

Let’s Rethink Security Lists

Pros
- Great content from various sources: OWASP Top Ten, SANS 20 Critical Security Controls, MITRE CWE Top 25, WASC TC v2, OWASP Top 10 - Mobile
- Provide a benchmark for testing and measurement
- Bring a broader industry perspective
- Better suited for more mature programs where benchmarking is timely

Cons
- Often end up defining an AppSec program’s baseline
- Used as the ground-floor level of metrics
- Tempt programs to look outwardly vs. inwardly
- Don’t foster good metrics taking root
- Tools don’t make quitting this trend easy (pre-defined profiles)
- Not a real basis for threat or risk analysis

Slide 19

How Do Lists Break Us Free from This Cycle?

Slide 20

METRICS & LISTS – TIMING IS EVERYTHING

Slide 21

OWASP OpenSAMM Project
- Evaluate an organization’s existing software security practices
- Build a balanced software security assurance program in well-defined iterations
- Demonstrate concrete improvements to a security assurance program
- Define and measure security-related activities throughout an organization

http://www.opensamm.org
- Dedicated to defining, improving, and testing the SAMM framework
- Always vendor-neutral, but with lots of industry participation
- Open and community driven
- Targeting new releases every 6-12 months
- Change management process

Slide 22

SAMM in a Nutshell
- Evaluate an organization’s existing software security practices
- Build a balanced software security assurance program in well-defined iterations
- Demonstrate concrete improvements to a security assurance program
- Define and measure security-related activities throughout an organization

Slide 23

OWASP OpenSAMM (Software Assurance Maturity Model)
- Look inward
- Start with the core activities tied to SDLC practices
- Named generically, but they should resonate with any developer or manager

Slide 24

Leveraging Lists at the Right Maturity Level
- Measure what you need across a framework’s (OpenSAMM) areas
- Identify ‘indicators’ that support business/product goals and objectives
- Apply the use of lists for benchmarking as maturity levels rise (see the sketch below)
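One way to express “apply lists as maturity levels rise”: a minimal sketch assuming hypothetical SAMM-style practice names, 0-3 maturity scores, and an arbitrary threshold; none of these values come from the talk.

# Hypothetical SAMM-style assessment: practice -> maturity level (0-3).
assessed = {"Threat Assessment": 1, "Code Review": 2, "Security Testing": 3}

BENCHMARK_READY_LEVEL = 3  # assumed threshold; tune to your own program

def benchmarks_to_enable(levels, threshold=BENCHMARK_READY_LEVEL):
    """Only practices that have reached the threshold get industry-list benchmarking."""
    return [p for p, lvl in levels.items() if lvl >= threshold]

print(benchmarks_to_enable(assessed))  # e.g. ['Security Testing'] gets benchmarked against OWASP Top Ten

Practices below the threshold keep reporting on their own organic indicators; only the mature ones are compared against an external Top X list.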

Slide 25

Develop ‘Organic’ Security Metrics

Reasons
- Supports contextual analysis based upon internal operations
- A top-down approach that works back to security metrics that matter
- Will substantiate security initiatives across non-InfoSec areas

Baking Organic Metrics (a mapping sketch follows this slide)
- Organizational Objectives → Operational Processes → Supporting Technology & Infrastructure
- BU/LoB Objectives
  - Revenue Growth: reputational loss, non-compliance
  - Cost Reduction: fines & penalties
- Product/Service Objectives
  - Product Innovation: IP security, insider threats, incident handling/response
  - Efficient Service Delivery: continuity, data integrity
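As an illustration of how the hierarchy above can be walked down to candidate metrics, here is a minimal sketch; the objective names, risk themes, and metric names are hypothetical placeholders, not content from the talk.

# Hypothetical mapping from business objectives down to candidate organic metrics.
objective_map = {
    "Revenue growth": {
        "risk_themes": ["Reputational loss", "Non-compliance"],
        "candidate_metrics": ["Customer-facing incidents per quarter",
                              "Open compliance gaps on revenue-generating apps"],
    },
    "Efficient service delivery": {
        "risk_themes": ["Continuity", "Data integrity"],
        "candidate_metrics": ["Unplanned downtime attributable to security events",
                              "Records failing integrity checks per batch run"],
    },
}

# Each metric traces back to an objective, so reporting stays relevant to the business.
for objective, detail in objective_map.items():
    print(objective, "->", detail["candidate_metrics"])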

Slide 26

Revisiting Lists
- Build your processes first
- Design metrics mapped to the activities for those processes
- Develop scorecards that report on organic security metrics related to operational and financial areas
- Bake in industry ‘lists’ in order to reflect more advanced quantitative analysis (Level 4)

Slide 27

Creating Scorecards
- Gap analysis (a small sketch follows this list)
  - Capturing scores from detailed assessments versus expected performance levels
- Demonstrating improvement
  - Capturing scores from before and after an iteration of assurance program build-out
- Ongoing measurement
  - Capturing scores over consistent time frames for an assurance program that is already in place
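A minimal sketch of the gap-analysis style of scorecard described above, assuming hypothetical SAMM-style practice names and 0-3 maturity scores.

# Hypothetical per-practice scores: target for this iteration vs. what was assessed.
target   = {"Strategy & Metrics": 2, "Threat Assessment": 2, "Security Testing": 3}
assessed = {"Strategy & Metrics": 1, "Threat Assessment": 2, "Security Testing": 1}

def gap_scorecard(target_scores, current_scores):
    """Gap analysis: positive numbers show how far each practice falls short of its target."""
    return {p: target_scores[p] - current_scores.get(p, 0) for p in target_scores}

print(gap_scorecard(target, assessed))

Re-running the same calculation before and after each assurance-program iteration, or on a consistent cadence, covers the "demonstrating improvement" and "ongoing measurement" cases as well.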

Slide 28

THANK YOU!