Rehabilitating CISOs to better understand how to employ security metrics that matter across an enterprise. Presentation delivered at Hacker Halted, an EC-Council event, in Atlanta, 2014.
About the speaker:
- Author of an upcoming Wiley title on risk-centric threat modeling
- Nearly 20 years of IT/security experience (Developer, Security Architect, Director of Risk Management, and CISO roles)
- Founder of VerSprite, a global security consulting firm based in Atlanta
- OWASP ATL Chapter/Project Leader for the past 6 years
- VerSprite works across Retail, Hospitality, Financial, Gov't, Higher Ed, Healthcare, and Info Services
- Co-organizer of BSides Atlanta, a grassroots security conference
- B.S. from Cornell in International Finance
- Contact info: [email protected] | @t0nyuv / @VerSprite
Pick one…
- Option A: a fully developed security controls framework with supporting metrics, based upon leading industry lists and control frameworks
- Option B: a fully developed risk framework where inherent and residual values are quantifiable, supported, and tied to business impact scenarios
Good Metrics:
- Built from 'What do I need?' (e.g., the goal of providing evidence of effective technology/process management)
- Data source is dependable and vast; metrics should have a reliable data source that grows over time
- Outliers are factored out (see the sketch below)
- Support clearly defined IT/business goals

"BSetrics" (metrics that 'feel/look good', e.g., closed risk issues):
- Built from 'What do I have?' (e.g., a tool begins to shape the metrics discussion)
- Based upon "industry standard" (keeping up with the Joneses)
- Built to manage perception
- Data set is limited (e.g., time, breadth, pre-fixed)
- Outliers are not factored out
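As a minimal sketch of the "outliers are factored out" point, the following hypothetical Python snippet computes a remediation-time metric after discarding IQR outliers; the data set and the 1.5×IQR rule are illustrative assumptions, not anything prescribed in the talk.

```python
import statistics

# Hypothetical remediation times in days, pulled from a dependable,
# growing data source (e.g., a ticketing-system export).
remediation_days = [4, 6, 5, 7, 9, 5, 6, 8, 120, 4, 7]  # 120 is an outlier

def iqr_filter(values):
    """Drop values outside 1.5 * IQR -- a common outlier rule."""
    q1, _, q3 = statistics.quantiles(values, n=4)
    iqr = q3 - q1
    lo, hi = q1 - 1.5 * iqr, q3 + 1.5 * iqr
    return [v for v in values if lo <= v <= hi]

filtered = iqr_filter(remediation_days)
print(f"Raw mean:      {statistics.mean(remediation_days):.1f} days")
print(f"Filtered mean: {statistics.mean(filtered):.1f} days")
```

A single stuck ticket (120 days) roughly triples the raw mean; filtering it out keeps the metric honest about typical performance.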
Examples of problem metrics:
- # of … tested
  - Unanswered: was the issue resolved, closed, or transferred?
  - Unanswered: is this 'issue' even important?
- # of High vulnerabilities closed
  - Deemed 'High' by whom?
  - No context (a High risk to a Low-value asset; see the weighting sketch below)
- # of code imperfections against a Top X list (SAST scan)
  - The Top X list begins to drive your risk perception devoid of anything else
  - Cultivates responses, remediation, and reports solely around Top X items
- # of pen tests / # of web app scans conducted
  - Doesn't factor in automated or poorly conducted testing
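To illustrate the "no context" complaint against raw High-severity counts, here is a hypothetical weighting sketch; the severity weights and asset values are invented for illustration, not a method from the talk.

```python
# Hypothetical context weighting: a "High" finding on a throwaway asset
# should not score the same as one on a crown-jewel asset.
SEVERITY = {"low": 1, "medium": 3, "high": 5}
ASSET_VALUE = {"test-server": 1, "intranet-wiki": 2, "payment-gateway": 10}

findings = [
    ("high", "test-server"),
    ("high", "payment-gateway"),
    ("medium", "payment-gateway"),
]

for sev, asset in findings:
    score = SEVERITY[sev] * ASSET_VALUE[asset]
    print(f"{sev:>6} on {asset:<16} -> contextual risk score {score}")
```

Note that the medium finding on the payment gateway (score 30) outranks the high finding on the test server (score 5), which a plain "# of Highs closed" metric would never reveal.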
Examples of AppSec Metrics Today

- Security gates enforced?
- Secure application development standards and testing criteria?
- Security status of a new application at delivery (e.g., % compliance with organizational security standards and application system requirements)
- Existence of a developer support website (FAQs, code fixes, lessons learned, etc.)?
- % of developers trained and using organizational security best-practice technology, architecture, and processes

Management Metrics:
- % of applications rated "business-critical" that have been tested
- % of applications that business partners, clients, or regulators require to be "certified"
- Average time to correct vulnerabilities (trending)
- % of flaws by lifecycle phase
- % of applications using centralized security services
- Business impact of critical security incidents
Vulnerability Metrics:
- # of vulnerabilities found
- Most commonly found vulnerabilities
- Reported defect rates based on security testing (per developer/team, per application)
- Root cause of "vulnerability recidivism"
- % of code that is re-used from other products/projects*
- % of code that is third party (e.g., libraries)*
- Results of source code analysis**:
  - Vulnerability severity by project, by organization
  - Vulnerabilities by category, by project, by organization
  - Vulnerability +/- over time by project
  - % of flaws by lifecycle phase, based on when testing occurs (see the sketch below)

Source: * webMethods, ** Fortify Software
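As a small illustration of two of the metrics listed above (average time to correct vulnerabilities and % of flaws by lifecycle phase), the following sketch uses an invented record layout; a real implementation would read from a tracker or scanner export.

```python
from collections import Counter
from datetime import date

# Hypothetical findings: (lifecycle phase when found, opened, fixed).
findings = [
    ("design", date(2014, 1, 10), date(2014, 1, 20)),
    ("code",   date(2014, 2, 1),  date(2014, 2, 15)),
    ("code",   date(2014, 2, 5),  date(2014, 3, 1)),
    ("test",   date(2014, 3, 2),  date(2014, 3, 9)),
]

# Average time to correct vulnerabilities (an input to trending).
days_to_fix = [(fixed - opened).days for _, opened, fixed in findings]
print(f"Avg. time to correct: {sum(days_to_fix) / len(days_to_fix):.1f} days")

# % of flaws by lifecycle phase (based on when testing occurs).
by_phase = Counter(phase for phase, _, _ in findings)
for phase, n in sorted(by_phase.items()):
    print(f"{phase:>6}: {100 * n / len(findings):.0f}%")
```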
[Chart: survey responses on why organizations measure information security. Base: 40 CISOs and senior security managers.]
- Manage risk: 63%
- Justification for security spending: 51%
- Regulations: 37%
- Loss of reputation: 26%
- Better stewardship: 23%
- Report progress to business: 11%

Source: "Measuring Information Security Through Metrics And Reporting," Forrester Research, Inc., May 2006
- Align to Biz/IT goals: metrics matter when they have direct or indirect relevance to operational/strategic goals
- Relate to business processes: directly or indirectly, the categories to be measured need to map to key indicators that matter in IT Ops, Sales, and Finance
- Map to a business impact: a good start is to map metric areas to key processes sustained by a BIA

Getting started:
- Start simple; forget what everyone else is doing, for now
- Perform an internal PoC with LOBs/BUs
- Grow the base of coverage over time
- Mature metrics by benchmarking against industry reports/analysis
[Figure: software assurance activities conducted at each lifecycle phase (Concept → Designs Complete → Test Plans Complete → Code Complete → Deploy → Post-Deployment). Activities include: security questions during interviews; threat analysis; security review; team member training; data mutation and least-privilege tests; review of old defects; check-ins checked; secure coding guidelines; use of tools; ongoing security push/audit; learn and refine; external review. Source: Microsoft]
Process Metrics: evidence of maturity
- Examples: secure coding standards in use; avg. time to correct critical vulnerabilities

Vulnerability Metrics: metrics about application vulnerabilities themselves
- Examples: by vulnerability type; by occurrence within a software development life cycle phase

Management Metrics: metrics specifically designed for senior management
- Examples: % of applications that are currently security "certified" and accepted by business partners; trending of critical unresolved and accepted risks

A catalog sketch follows below.
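One way to keep these three categories explicit in tooling is to tag every metric in a catalog with its category; a minimal sketch, where the first category's name ("Process") is my placeholder for the label truncated in the slide.

```python
from dataclasses import dataclass
from enum import Enum

class MetricCategory(Enum):
    PROCESS = "evidence of maturity"  # placeholder label for truncated slide text
    VULNERABILITY = "about the vulnerabilities themselves"
    MANAGEMENT = "designed for senior management"

@dataclass
class Metric:
    name: str
    category: MetricCategory

catalog = [
    Metric("Secure coding standards in use", MetricCategory.PROCESS),
    Metric("Vulnerabilities by type", MetricCategory.VULNERABILITY),
    Metric("% of apps security-certified", MetricCategory.MANAGEMENT),
]

for m in catalog:
    print(f"{m.category.name:<13} {m.name}")
```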
developed" science such as physics and some of the less "well- developed" sciences such as psychology or sociology is the degree to which things are measured.” Source: Fred S. Roberts, ROBE79 “Give information risk management the quantitative rigor of financial information management.” Source: CRA/NSF, 10 Year Agenda for Information Security Research, cited by Dr. Dan Geer
Example sources: OWASP Top Ten, SANS 20 Critical Security Controls, MITRE CWE Top 25, WASC Threat Classification v2, OWASP Top 10 Mobile

Pros:
- Provide a benchmark for testing and measurement
- Bring a broader industry perspective
- Better suited for more mature programs, where benchmarking is timely

Cons:
- The list defines the AppSec program's baseline and gets used as the ground floor of metrics
- Tempts programs to look outwardly vs. inwardly
- Doesn't let Good Metrics take root
- Tools don't make quitting this trend easy (pre-defined profiles)
- Not a real basis for threat or risk analysis

A benchmarking sketch follows below.
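As an illustration of using a list as a benchmark rather than as the program's baseline, this hypothetical snippet counts already-tagged findings against a few OWASP Top Ten (2013) categories; the findings themselves are invented.

```python
from collections import Counter

# A few OWASP Top Ten (2013) categories, used purely as a benchmark.
OWASP_2013 = {"A1": "Injection", "A2": "Broken Auth", "A3": "XSS"}

# Hypothetical scanner findings already tagged with an OWASP category.
findings = ["A1", "A3", "A3", "A1", "A2", "A3"]

counts = Counter(findings)
for cat_id, name in OWASP_2013.items():
    print(f"{cat_id} {name:<12}: {counts.get(cat_id, 0)} findings")
```

The benchmark view is a lens over metrics you already collect; the trap described above is letting the A1-A10 buckets become the only categories your program can see.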
OpenSAMM (Software Assurance Maturity Model) helps organizations:
- Evaluate their existing software security practices
- Build a balanced software security assurance program in well-defined iterations
- Demonstrate concrete improvements to a security assurance program
- Define and measure security-related activities throughout an organization

http://www.opensamm.org
- Dedicated to defining, improving, and testing the SAMM framework
- Always vendor-neutral, but with lots of industry participation
- Open and community driven
- Targeting new releases every 6-12 months
- Change management process

A scoring sketch follows below.
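A minimal sketch of SAMM-style scoring, assuming the v1.0 layout of four business functions with three security practices each, rated at maturity levels 0-3; the scores themselves are invented.

```python
# SAMM v1.0: four business functions, three security practices each,
# each practice rated at a maturity level from 0 to 3.
scores = {
    "Governance":   {"Strategy & Metrics": 1, "Policy & Compliance": 2,
                     "Education & Guidance": 1},
    "Construction": {"Threat Assessment": 0, "Security Requirements": 1,
                     "Secure Architecture": 1},
    "Verification": {"Design Review": 0, "Code Review": 2,
                     "Security Testing": 1},
    "Deployment":   {"Vulnerability Management": 1, "Environment Hardening": 1,
                     "Operational Enablement": 0},
}

for function, practices in scores.items():
    avg = sum(practices.values()) / len(practices)
    print(f"{function:<12} avg maturity: {avg:.2f}")
```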
- Identify the metrics you need across a framework's (OpenSAMM's) areas
- Identify 'indicators' that support business/product goals and objectives
- Apply the use of lists for benchmarking as maturity levels rise
Organic metrics are based upon internal operations: a top-down approach that regresses from business objectives to security metrics that matter, and will substantiate security initiatives across non-InfoSec areas.

Baking organic metrics, top down:
- Organizational objectives
- BU/LoB objectives, e.g. revenue growth (reputational loss, non-compliance) and cost reduction (fines and penalties)
- Product/service objectives, e.g. product innovation (IP security, insider threats, incident handling/response) and efficient service delivery (continuity, data integrity)
- Operational processes
- Supporting technology and infrastructure

A mapping sketch follows below.
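To make the top-down chain concrete, here is a hypothetical mapping from business objectives through processes to candidate organic metrics; every name and metric below is invented for illustration.

```python
# Hypothetical top-down mapping: objective -> process -> candidate metrics.
mapping = {
    "Efficient service delivery": {
        "Order fulfillment (BIA-critical)": [
            "Unplanned downtime of fulfillment systems (hrs/quarter)",
            "Data-integrity exceptions per 10k transactions",
        ],
    },
    "Revenue growth": {
        "Customer checkout": [
            "Fraud losses as % of online revenue",
            "Checkout security incidents causing abandonment",
        ],
    },
}

for objective, processes in mapping.items():
    print(objective)
    for process, metrics in processes.items():
        print(f"  {process}")
        for metric in metrics:
            print(f"    - {metric}")
```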
- Identify key operational processes, with metrics mapped to the activities for those processes
- Develop scorecards that report on organic security metrics tied to operational and financial areas (see the scorecard sketch below)
- Bake in industry 'lists' in order to reflect more advanced quantitative analysis (Level 4)
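A minimal scorecard sketch that rolls organic metrics up against targets; the metric names, values, targets, and direction flags are all invented.

```python
# Hypothetical scorecard: each metric has a current value, a target,
# and a direction flag, so status reads in business terms.
scorecard = [
    ("Avg. days to patch payment systems",         12.0, 10.0, "lower"),
    ("Fraud losses as % of online revenue",         0.4,  0.5, "lower"),
    ("BIA-critical apps pen-tested this year (%)", 70.0, 90.0, "higher"),
]

for name, value, target, better in scorecard:
    met = value <= target if better == "lower" else value >= target
    status = "OK" if met else "ATTENTION"
    print(f"{status:<9} {name}: {value} (target {target})")
```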
Using SAMM scores:
- Gap analysis: capturing scores from detailed assessments versus expected performance levels
- Demonstrating improvement: capturing scores from before and after an iteration of assurance program build-out
- Ongoing measurement: capturing scores over consistent time frames for an assurance program that is already in place

A gap-analysis sketch follows below.
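A small sketch of the first two uses of scoring described above: the gap to target levels, and the delta between two assessment iterations. The practice names are from SAMM; the scores are invented.

```python
# Hypothetical SAMM practice scores: target vs. two assessment iterations.
target = {"Code Review": 2, "Security Testing": 2, "Threat Assessment": 1}
before = {"Code Review": 1, "Security Testing": 1, "Threat Assessment": 0}
after  = {"Code Review": 2, "Security Testing": 1, "Threat Assessment": 1}

for practice in target:
    gap = target[practice] - after[practice]    # gap analysis
    delta = after[practice] - before[practice]  # demonstrating improvement
    print(f"{practice:<18} gap to target: {gap:+d}, "
          f"change this iteration: {delta:+d}")
```

Ongoing measurement is the same computation repeated over consistent time frames once the program is in place.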