What am I going to talk about?
Car/Telematics Attack Surfaces: ECU, in-car networks; car attack surface; infrastructure attack surface
Offensive 'Lenses': Systems, assumptions & constraints; security composition; adversarial dynamics; systemic issues; incentive structures
Some proposed IC implications and directions
Figure: Telematics, Toyota Prius 2010: Qualcomm chip, 3G/CDMA connection. Researchers previously remotely exploited a telematics unit without user interaction. Picture from [VM14]
Offensive Mindset
When someone talks about 'security', parameters matter. What does "the system is secure" mean? The defender has to decide: what part to protect, against whom, against what, for how long, at what cost and risk, and using which methods? The threat/attacker capability model is often only implied, or neglected altogether.
No one-size-fits-all answers: mission- and adversary-model dependent. Trade-offs are inevitable.
Keywords for today:
Assumptions: attacks vs. errors, trust [Bil10]
Incentive structures. Easy: mechanism design [GNG08], "skin in the game"
Compositional security. Unlikely (for real systems): LANGSEC [LSS11]
Systemic security: "Rise of the Machines" [Joh13]
Systems, Subsystems and nth Order Attacks [Bil09]
Objective: Induce instabilities in mission-sustaining ancillary systems that ultimately degrade, disable or subvert the end system.
n: degree of relation. A 0th order attack targets the end system, a 1st order attack targets an ancillary system of the end system, a 2nd order attack an ancillary system of that ancillary system, etc.
Systems Definition: A whole that functions by virtue of interaction between constitutive components. Defined by relationships. Components may themselves be systems. Key points: open, isomorphic laws.
Nature: Technical, algorithmic, societal, psychological, ideological, economic, biological and ecological.
Examples: Resource allocation / throughput / stability control, manufacturing, visualization environments, social welfare systems, voting systems, data / goods / energy generation / transmission / distribution, reputation management, entropy externalization, business models and economic systems.
Systems, Attacks and Assumption Violation
Assumptions: Fundamentally, attacks work because they violate assumptions. Finite (i.e. real-life, engineered or evolved) systems incorporate implicit/explicit assumptions into their structure, functionality and language. Systems are geared towards 'expected', 'typical' cases; assumptions reflect those 'designed-for' cases.
Intuitive examples of attacks and the assumptions they violate:
Man-in-the-Middle attacks: identity assumption violated
Race condition attacks: ordering assumption violated
BGP routing attacks: trust assumption violated
Generative Mechanism and Assumptions: An optimization process incorporating tradeoffs between objective functions and resource constraints under uncertainty; assumptions are implicit/explicit in the optimization formulations.
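The ordering-assumption violation behind race condition attacks can be made concrete with a minimal time-of-check/time-of-use sketch (my illustration, not from the slides; all names are hypothetical). The code assumes the checked state cannot change before the act, and an adversarial interleaving breaks exactly that assumption:

```python
# TOCTOU sketch: a check-then-act sequence whose implicit
# "nothing happens in between" ordering assumption is violated.

balance = 100

def withdraw_check(amount):
    # Step 1: check. The code assumes the balance cannot change
    # between this check and the debit below.
    return balance >= amount

def withdraw_act(amount):
    # Step 2: act. If another actor debited in between, the
    # ordering assumption is violated and we overdraw.
    global balance
    balance -= amount

# Sequential use: the assumption holds.
assert withdraw_check(80)
withdraw_act(80)                 # balance is now 20

# Adversarial interleaving: both actors pass the check first.
balance = 100
ok_a = withdraw_check(80)
ok_b = withdraw_check(80)        # attacker interleaves before A acts
assert ok_a and ok_b             # both checks succeeded on stale state
withdraw_act(80)                 # A debits
withdraw_act(80)                 # B debits: balance is now -60, overdrawn
print(balance)
```

In a real system the interleaving comes from the scheduler rather than explicit calls, but the violated assumption is the same.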
Optimization Process: Highly Optimized Tolerance (HOT)
Background: Generative first-principles approach proposed to account for power laws P(m) ∝ m^(−α) e^(−m/k_c) in natural/engineered systems [CSN07, CD00]. The optimization model incorporates tradeoffs between objective functions and resource constraints in probabilistic environments. Applied to forest fires, internet traffic, power grids, immune systems, computer security (me).
Pertinent Trait: Robust towards common perturbations, but fragile towards rare events. Inducing 'rare events' in ancillary systems is the goal of an nth order attack. The 'connected car' is composed of nested, embedded and embedding ancillary systems.
Probability, Loss, Resource (PLR) Optimization [MCD05]:
    min J                      (1)
    subject to  Σ_i r_i ≤ R   (2)
    J = Σ_i p_i l_i            (3)
    l_i = f(r_i)               (4)
    1 ≤ i ≤ M                  (5)
M events (5) occur iid with probability p_i, incurring loss l_i (3). The sum-product J is the objective function to be minimized (1). Resources r_i are hedged against losses l_i, with normalizing f(r_i) = −log r_i (4), subject to the resource bound R (2).
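With f(r) = −log r, the PLR program above has a standard closed-form solution via Lagrange multipliers: r_i = p_i · R, i.e. resources proportional to event probability. That derivation is not spelled out on the slide, so the sketch below states it as an assumption and checks it numerically against random feasible allocations:

```python
import math, random

# PLR sketch: minimize J = sum_i p_i * (-log r_i) subject to sum_i r_i <= R.
# Assumed closed form (Lagrange multipliers, not on the slide): r_i = p_i * R.

def optimal_allocation(p, R):
    return [pi * R for pi in p]

def J(p, r):
    return sum(pi * -math.log(ri) for pi, ri in zip(p, r))

p = [0.6, 0.3, 0.1]        # event probabilities, M = 3
R = 1.0                    # total resource budget
r_star = optimal_allocation(p, R)

# Sanity check: no random feasible allocation does better.
random.seed(0)
for _ in range(1000):
    r = [random.random() for _ in p]
    s = sum(r)
    r = [ri / s * R for ri in r]            # rescale onto the budget
    assert J(p, r) >= J(p, r_star) - 1e-12

print(r_star)   # resources proportional to probability: [0.6, 0.3, 0.1]
```

This makes the HOT trait visible: high-probability events get almost all the hedging, while rare events are starved of resources, which is precisely the fragility an nth order attack induces.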
LANGSEC: Parsers, Recognizers
Figure: Every piece of software that takes input contains a de facto recognizer for accepting valid or expected inputs and rejecting invalid or malicious ones. This recognizer code is often ad hoc, spread throughout the program, and interspersed with processing logic (a "shotgun parser"). This exposes the processing logic to exploitation and leads programmers to false assumptions of data safety [SB14]. Picture from [SBH13]
LANGSEC: Compositional Security
Secure Composition Problem: What can you say about the composition of modules A and B?
Parsers, Language Classes & Power
Formal input verification: input to the parser constitutes a valid expression in the input-handler's protocol.
Secure composition: prove computational equivalence of input-handling routines, i.e. do two grammars produce exactly the same language? (If not, in extremis, the birth of 'weird machines'.)
Requirement: Equivalence is undecidable for complex protocols, starting from language classes that require a non-deterministic PDA to recognize the input language. It remains decidable for regular and deterministic context-free grammars.
Way forward: Minimum Power Principle to reduce insecurity of composition:
1 A parser must not provide more than the minimal computational strength necessary to interpret the protocol it is intended to parse.
2 Protocols should be designed to require the computationally weakest parser necessary to achieve the intended operation.
"LangSec" (Nov 2011): IMHO the most fundamental intuition in computer security since Thompson's (1984) "Trusting Trust" [LSS11].
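The Minimum Power Principle can be sketched with a toy protocol of my own invention (not from the slides): messages of the form "SET key=value;" form a regular language, so a finite-automaton-strength recognizer suffices, and two implementations of it can in principle be compared for equivalence. The point is full recognition before any processing, instead of the shotgun-parser anti-pattern:

```python
import re

# Toy protocol (hypothetical): "SET <key>=<value>;" with lowercase key,
# decimal value. Regular language -> a regex (DFA strength) fully
# recognizes it; no more computational power than necessary.

MSG = re.compile(r'SET [a-z]+=[0-9]+;')

def recognize(raw: str):
    """Recognize the WHOLE input before any processing: reject invalid
    input outright rather than scattering checks through the logic."""
    if MSG.fullmatch(raw) is None:
        return None                       # not a valid protocol expression
    key, value = raw[4:-1].split('=')
    return key, int(value)

assert recognize("SET speed=88;") == ("speed", 88)
assert recognize("SET speed=88; DROP TABLE") is None  # rejected as a whole
assert recognize("SET =1;") is None                   # empty key rejected
```

A shotgun parser would instead split on '=' first and validate pieces mid-processing, which is exactly where mismatched assumptions between composed modules creep in.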
Adversarial Dynamics
Background: Data on US border security, computer vulnerability databases, and offensive & defensive coevolution of worms (Conficker), modeled as players in an adversarial situation.
Findings: Performance metrics oscillate over time; no asymptotic convergence, not monotonic. Classic game theory is not useful here: no good fit.
Claim: In realistic (adversarial) games, players do not compute Nash equilibria over strategy sets; they use myopically perceived best responses at each time step. Why? It is not a stationary environment: ongoing sequences of countermoves, deception and strategic adaptation.
Figure: Non-adversarial: traffic deaths. Figure: Adversarial: vulnerabilities / exploits
Collective Behavior of Interacting Agents
Beginnings: Bell Labs 'Core War' (1960s), Conway's 'Game of Life' (1970s).
Yesterday: Flash Crash 2010: billions of USD evaporated in a fraction of a second.
Today: Thousands of mini-flash-crashes every week. HFT shenanigans & collusion schemes finally being investigated by the NY AG.
"Rise of the Machines" [Joh13]: Phenomenological 'signatures' of automated black-box algorithmic trading. The all-machine time regime is characterized by frequent 'black swan' events with ultrafast durations. Collective behavior is unpredictable: no useful security guarantees anent dynamics are possible.
Figure: HFT "painting the tape", the illegal practice of creating fictitious activity in a stock: 70k+ meaningless bids/offers blasted in 47 seconds. Picture from Nanex
Big Data Quandary: High-Dimensional & Adversarial Space
Figure: Two-player adversarial non-zero-sum game with reinforcement learning strategies. α is memory (0 ≈ all steps remembered, 1 ≈ no memory). Γ is deviation from a zero-sum game (−1 ≈ zero-sum, 0 ≈ uncorrelated payoffs, 1 ≈ identical payoffs). β is intensity of choice (0 ≈ all moves equally likely, large ≈ some moves preferred). α ≈ 0 corresponds to replicator dynamics (previous slides).
Ramifications for IC R&D
Repurpose In-Car Vulnerabilities
Defensive: Disable tracking (OnStar, TPMS, etc.) through an OBD-II plug-in (⌅ idea)
Offensive: Bind/intercept the cell signal of occupants via RF circuitry in embedded systems (⌅ idea)
Control Sensor Networks
Offensive: Destabilize/degrade a traffic sensor system (Cesar Cerrudo, DefCon 2014)
Defensive: Stabilization via signals 'nudging' back to a stable state [Bil09]
Observe Adversarial Metrics
Performance oscillations modeled by replicator equations: typically 3rd order, non-linear, analytically difficult. Inversely estimating replicator-equation parameters from observations of behavior is tractable.
Offensive/Defensive: Infer the actual game being played as it unfolds: players' motives, costs and move options.
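The oscillatory, non-converging behavior the slides attribute to replicator equations can be sketched with the textbook case: rock-paper-scissors under discrete-time replicator dynamics (my illustration, not the slides' actual model or data). The strategy mix keeps cycling around the Nash point rather than settling:

```python
# Replicator dynamics sketch: zero-sum rock-paper-scissors, forward-Euler
# integration. The 'rock' share oscillates indefinitely around 1/3
# instead of converging -- the qualitative signature on the slide.

def replicator_step(x, payoff, dt=0.01):
    fitness = [sum(payoff[i][j] * x[j] for j in range(3)) for i in range(3)]
    avg = sum(xi * fi for xi, fi in zip(x, fitness))
    # x_i' = x_i + dt * x_i * (fitness_i - average fitness)
    return [xi + dt * xi * (fi - avg) for xi, fi in zip(x, fitness)]

# Each strategy beats one opponent and loses to the other (zero-sum).
RPS = [[0, -1, 1],
       [1, 0, -1],
       [-1, 1, 0]]

x = [0.5, 0.3, 0.2]              # initial strategy mix
trajectory = [x[0]]
for _ in range(5000):
    x = replicator_step(x, RPS)
    trajectory.append(x[0])

# Late in the run the 'rock' share is still swinging, not settled at 1/3.
late_swing = max(trajectory[-1000:]) - min(trajectory[-1000:])
assert late_swing > 0.05
```

Given such observed trajectories, the inverse problem the slide calls tractable is fitting the payoff matrix (the "actual game") back from the oscillation pattern.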
R&D Funding as Shì (勢), the Propensity of Things [JL95]
Characteristics: Formal, dynamic, strategic; a fusion of form and momentum; exploiting an achieved position to maximum effect. Metaphors: the life of the brush on a fluid line, the potential of the womb. Reality is perceived as a particular arrangement of things to be relied upon and worked to one's advantage.
Investments as shì: Drivers & incentive structures to evolve the "game-creating game" [mechanism design [GNG08], Nobel Prize in Economics 2007].
Game-creating Game: A meta-game that drives co-evolution between attacker and defender towards a position favorable to defense [Ant10].
Cautionary Tale: Conficker A-E: ad-hoc defensive measures (no meta-game consideration) that ultimately resulted in a net worse defensive position [BMC13].
Thank You
How scientists relax: infrared spectroscopy on a vexing problem of our times: truly comparing apples and oranges.
Thank you for your time and the consideration of these ideas. I appreciate the invitation to speak at ⌅⌅ in lovely Virginia.
Figure: A spectrographic analysis of ground, desiccated samples of a Granny Smith apple and a Sunkist navel orange. Picture from [San95]
References I
Gary Anthes, Mechanism design meets computer science, Commun. ACM 53 (2010), no. 8, 11–13.
Daniel Bilar, On nth order attacks, The Virtual Battlefield: Perspectives on Cyber Warfare (Christian Czosseck and Kenneth Geers, eds.), IOS Press, 2009, pp. 262–281.
Daniel Bilar, Degradation and subversion through subsystem attacks, IEEE Security & Privacy 8 (2010), no. 4, 70–73.
D. Bilar, J. Murphy, and G. Cybenko, Adversarial dynamics: Conficker case study, Moving Target Defenses (S. Jajodia, ed.), vol. II, Springer, 2013, pp. 41–71.
Jean Carlson and John Doyle, Highly Optimized Tolerance: Robustness and design in complex systems, Physical Review Letters 84 (2000), no. 11, 2529+.
Cesar Cerrudo, Hacking US traffic control systems, DefCon 22, 2014.
Aaron Clauset, Cosma R. Shalizi, and Mark Newman, Power-law distributions in empirical data, SIAM Review (2007).
References II
Dinesh Garg, Y. Narahari, and Sujit Gujar, Foundations of mechanism design: A tutorial, Part 1: Key concepts and classical results, Sadhana (Academy Proceedings in Engineering Sciences), vol. 33, Indian Academy of Sciences, 2008, pp. 83–130.
François Jullien and Janet Lloyd, The Propensity of Things: Toward a History of Efficacy in China, Zone Books, New York, 1995.
Neil Johnson, Abrupt rise of new machine ecology beyond human response time, Nature Scientific Reports 3 (2013).
Len Sassaman, Meredith L. Patterson, Sergey Bratus, and Anna Shubina, The halting problems of network stack insecurity, ;login: 36 (2011), no. 6.
Alexis C. Madrigal, September 2014.
Lisa Manning, Jean Carlson, and John Doyle, Highly Optimized Tolerance and power laws in dense and sparse resource regimes, Physical Review E 72 (2005), no. 1, 16108+.
Scott Sandford, Apples and oranges: A comparison, Annals of Improbable Research 1 (1995), no. 3.
Sergey Bratus and Felix Lindner, Information security war room, USENIX Security, 2014.
References III
Sergey Bratus, Meredith Patterson, and Dan Hirsch, From "shotgun parsers" to more secure stacks, ShmooCon, 2013.
Chris Valasek and Charlie Miller, A survey of remote automotive attack surfaces, Black Hat USA, 2014.