
Toorcon 16 (2014) - Time Trial - Racing Towards Practical Timing Attacks

Attacks on software have become increasingly sophisticated over time, and while the community has a good understanding of many commonly exploited classes of vulnerabilities, the practical relevance of side-channel attacks is much less well understood.

One common class of side-channel vulnerabilities present in many web applications today is the timing side-channel, which allows an attacker to extract information based on differences in response times. These side-channel vulnerabilities are easily introduced wherever sensitive values such as credentials are compared to an attacker-controlled value before responding to a client. Subtle timing side-channels can also exist when an attacker is able to influence logical branching that leads to different response times. Even though there is basic awareness of timing side-channel attacks in the community, they often go unnoticed or are flagged during code audits without a true understanding of their exploitability in practice.

In this talk, we provide both a tool ‘time trial’ and guidance on the detection and exploitability of timing side-channel vulnerabilities in common web application scenarios. Specifically, the focus of our presentation is on remote timing attacks, which are performed over a LAN, in a cloud environment, or on the Internet. To illustrate this, we first present detailed empirical timing results that demonstrate which timing differences can be distinguished remotely using our tool. Second, we compare our results with timing differences that are typically encountered in modern web frameworks and servers for both comparison-based and branching-based vulnerabilities. The discussed attack scenarios include database queries, message authentication codes, API keys, OAuth tokens, login functions, and cryptographic implementations. We cover scenarios where these attacks are practical, and also present negative results that show the limitations of these attacks against modern systems.

Our presentation has significance for a wide spectrum of the conference audience. Attendees in defensive security roles will gain a better understanding of the threat timing side-channel vulnerabilities pose and, based on the demonstrated attacks, will be better able to evaluate the severity and impact of a successful side-channel attack. Attendees in a penetration testing role will learn how to distinguish theoretical timing side-channels from legitimately exploitable flaws by using our tool ‘time trial’ and understand the challenges in performing these attacks in practice. Finally, attendees focused on research implications will receive a comprehensive update on the state-of-the-art in exploiting timing attacks in practice.

Daniel A. Mayer

October 25, 2014

Transcript

  1. Joel Sandin Time Trial Racing Towards Practical Remote Timing Attacks

    @DanlAMayer http://cysec.org [email protected] Daniel A. Mayer October 25, 2014 - San Diego, CA
  2. Daniel A. Mayer and Joel Sandin » Time Trial Who

    we are… ‣ Daniel A. Mayer • Senior consultant with Matasano Security. • Ph.D. in Computer Science (Security and Privacy). ‣ Joel Sandin • Appsec consultant with Matasano ! ‣ Matasano Security • Application Security Consultancy. • Offices in New York, Chicago, Sunnyvale. • Part of 2
  3. Daniel A. Mayer and Joel Sandin » Time Trial Agenda

    1. Timing Side-Channels 2. Remote Timing Attacks 3. Our Tool: Time Trial 4. Timing Attacks in Practice 5. Conclusion 3
  4. Daniel A. Mayer and Joel Sandin » Time Trial Examples

    of Side-Channels ‣ Power consumption ‣ RF emissions ‣ Sound ‣ Processing Time ! ‣ Really, anything that can be measured and is related to a secret. 6
  5. Daniel A. Mayer and Joel Sandin » Time Trial ‣

    Many vulnerabilities well understood • XSS, CSRF, SQL injection • Developers becoming more aware • Frameworks: Harder to introduce bugs ! ‣ Side-channels: Less so • Easy to introduce using “innocent” operators • Hard to observe and test for • Have to go out of one’s way to prevent them “Regular Vulns” vs. Side-Channels 7
  6. Daniel A. Mayer and Joel Sandin » Time Trial Timing

    Side-Channels ‣ Response time differs depending on computation ! ‣ Attacker can learn information about system • sensitive credentials • internal system state ! ‣ Easy to introduce 8 ‣ Exploitable remotely?
  7. Daniel A. Mayer and Joel Sandin » Time Trial Timing

    Side-Channels 8 ‣ Exploitable remotely?
  8. Daniel A. Mayer and Joel Sandin » Time Trial Basic Timing Side-Channel

    post '/login' do
      if not valid_user?(params[:user])
        "Username or Password incorrect"
      else
        if verify_password(params[:user], params[:password])
          "Access granted"
        else
          "Username or Password incorrect"
        end
      end
    end
    Slide labels: valid user with wrong password vs. invalid user. 9
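The leak in this handler can be exercised locally. The sketch below simulates the slide's two code paths with hypothetical stand-ins for `valid_user?` and `verify_password` (the names mirror the slide; the bodies, including the `sleep` that models an expensive password hash, are assumptions for illustration):

```ruby
require 'digest'

# Hypothetical stand-ins for the slide's helpers: valid_user? is a fast
# hash lookup, verify_password models an expensive password check.
USERS = { 'alice' => Digest::SHA256.hexdigest('s3cret') }

def valid_user?(user)
  USERS.key?(user)
end

def verify_password(user, password)
  sleep 0.05 # stand-in for an expensive password hash (e.g. bcrypt)
  USERS[user] == Digest::SHA256.hexdigest(password)
end

def login(user, password)
  if not valid_user?(user)
    'Username or Password incorrect'   # fast path: t_0
  else
    if verify_password(user, password)
      'Access granted'
    else
      'Username or Password incorrect' # slow path: t_1
    end
  end
end

def time_of
  t0 = Process.clock_gettime(Process::CLOCK_MONOTONIC)
  yield
  Process.clock_gettime(Process::CLOCK_MONOTONIC) - t0
end

t_invalid = time_of { login('mallory', 'guess') } # invalid user
t_valid   = time_of { login('alice',   'guess') } # valid user, wrong password
puts format('invalid user: %.4f s, valid user: %.4f s', t_invalid, t_valid)
```

Both failure paths return the identical string, yet the valid-user path takes measurably longer, which is exactly what the attacker observes.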
  9. Daniel A. Mayer and Joel Sandin » Time Trial Timing

    Attacks ‣ Reason about system based on response time 10 Attacker Target POST /login with username, password Username or Password incorrect valid_user? verify_password Username or Password incorrect Access granted Start t_0 t_1 Invalid user Valid user
  10. Daniel A. Mayer and Joel Sandin » Time Trial Prior

    Work! ‣ Rich history of timing attacks in crypto, e.g. • Kocher, 1996
 Timing Attacks on Implementations of Diffie-Hellman, RSA, DSS, and Other Systems • Brumley and Boneh, 2005
 Remote Timing Attacks are Practical ! ‣ Excellent empirical studies, e.g. • Crosby et al., 2009
 Opportunities and Limits of Remote Timing Attacks • Lawson and Nelson, 2010
 Exploiting Timing Attacks In Widespread Systems 11
  11. Daniel A. Mayer and Joel Sandin » Time Trial ‣

    Local attacks • Precise measurement of execution time • Can minimize external influences Local vs. Remote - Challenges 13 + Signal Jitter Measured ‣ Remote attacks • Propagation time added to the measurement. • Network delays add jitter.
  12. Daniel A. Mayer and Joel Sandin » Time Trial Real

    Jitter ‣ Additional Caveat: • Distribution isn’t Gaussian, hard to model • Skewed, multiple modes 14
  13. Daniel A. Mayer and Joel Sandin » Time Trial ‣

    Measure a large number of response times ‣ Measurement must be related to processing time! ‣ Median and minimum not good indicators Statistical Methods 15 Based on Crosby et al. 0 2 4 6 9 10 12 94 98 102 106 110 114 Server Processing Time Measured Response Time 1 to ~20% Minimum (0%)
  14. Daniel A. Mayer and Joel Sandin » Time Trial ‣

    The Box Test ‣ Compare intervals induced by percentiles ‣ Percentiles to be determined empirically 0.52 0.53 0.54 0.55 0.56 0.57 0.58 0.59 0.6 Time / [ms] 0 20 40 60 80 100 120 140 Frequency 100 ms 105 ms Statistical Methods 16
  15. Daniel A. Mayer and Joel Sandin » Time Trial ‣

    The Box Test ‣ Compare intervals induced by percentiles ‣ Percentiles to be determined empirically 0.52 0.53 0.54 0.55 0.56 0.57 0.58 0.59 0.6 Time / [ms] 0 20 40 60 80 100 120 140 Frequency 100 ms 105 ms 6% Statistical Methods 16
  16. Daniel A. Mayer and Joel Sandin » Time Trial ‣

    The Box Test ‣ Compare intervals induced by percentiles ‣ Percentiles to be determined empirically 0.52 0.53 0.54 0.55 0.56 0.57 0.58 0.59 0.6 Time / [ms] 0 20 40 60 80 100 120 140 Frequency 100 ms 105 ms 8% Statistical Methods 16
  17. Daniel A. Mayer and Joel Sandin » Time Trial ‣

    The Box Test ‣ Compare intervals induced by percentiles ‣ Percentiles to be determined empirically 0.52 0.53 0.54 0.55 0.56 0.57 0.58 0.59 0.6 Time / [ms] 0 20 40 60 80 100 120 140 Frequency 100 ms 105 ms Statistical Methods 16
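The box test can be sketched in Ruby roughly as follows. Two response-time samples are declared distinguishable when a low-percentile interval of one does not overlap the same interval of the other; the percentile pair used here (3rd to 9th) is an illustrative assumption, since the talk determines the percentiles empirically per network:

```ruby
# Box test sketch (after Crosby et al.): compare the intervals induced
# by a pair of low percentiles of each sample's distribution.
def percentile(sorted, p)
  sorted[(p / 100.0 * (sorted.length - 1)).round]
end

def box_test(sample_a, sample_b, lo: 3, hi: 9)
  a = sample_a.sort
  b = sample_b.sort
  box_a = [percentile(a, lo), percentile(a, hi)]
  box_b = [percentile(b, lo), percentile(b, hi)]
  # Distinguishable when the two boxes do not overlap.
  box_a[1] < box_b[0] || box_b[1] < box_a[0]
end

fast = Array.new(1000) { 100.0 + rand * 5 } # synthetic ~100 ms responses
slow = Array.new(1000) { 105.0 + rand * 5 } # synthetic ~105 ms responses
puts box_test(fast, slow)
```

Low percentiles are used because, as the previous slide notes, the jitter distribution is skewed: the fastest few percent of responses sit closest to the true processing time.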
  18. Daniel A. Mayer and Joel Sandin » Time Trial Why

    a tool for timing attacks? ‣ No way to demonstrate impact ! ! ‣ Separate theoretical issues 
 from exploitable vulnerabilities ! ! ‣ Reframes the debate about practicality of these attacks 18
  19. Daniel A. Mayer and Joel Sandin » Time Trial Time

    Trial ‣ What Time Trial is: • A framework for capturing precise timing • A tool for feasibility analysis • A generator of visual proof-of-concepts ! ‣ What Time Trial is NOT (yet): • A ready-to-use exploit framework • An automated attack tool 19
  20. Daniel A. Mayer and Joel Sandin » Time Trial Goals

    and Design ‣ Separate “racer” sensor from analytic front end. • Front end: Python + Qt • Racer: C++ ‣ Schedule trials and analyze results 20 1. Define Trials in the GUI 1 2. Redis-Backed RQ Target 3. Racer Executes 4. Results in Queue 5. View/Analyze Results
  21. Daniel A. Mayer and Joel Sandin » Time Trial How

    to do precise time measurements? 21
  22. Daniel A. Mayer and Joel Sandin » Time Trial How

    to do precise time measurements? 21
  23. Daniel A. Mayer and Joel Sandin » Time Trial How

    to do precise time measurements? 21
  24. Daniel A. Mayer and Joel Sandin » Time Trial Optimizations

    ‣ Use clock_gettime for nanosecond timer • Using MONOTONIC clock ! ‣ Use a fixed, reserved CPU core • GRUB_CMDLINE_LINUX_DEFAULT="maxcpus=2 isolcpus=1" • CPU affinity ! ‣ Run with real-time priority ! ‣ Disable frequency scaling 22
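Ruby exposes the same clock_gettime(CLOCK_MONOTONIC) interface the slide recommends (the racer itself is C++; this is only a scripting-level sketch, and the timed loop is a placeholder workload):

```ruby
# Minimal monotonic nanosecond stopwatch via clock_gettime, as on the
# slide; CLOCK_MONOTONIC is immune to wall-clock adjustments.
t0 = Process.clock_gettime(Process::CLOCK_MONOTONIC, :nanosecond)
10_000.times { |i| i * i } # placeholder workload under measurement
t1 = Process.clock_gettime(Process::CLOCK_MONOTONIC, :nanosecond)
puts "elapsed: #{t1 - t0} ns"
```

The remaining optimizations on the slide (isolated core, real-time priority, fixed frequency) are OS-level and reduce jitter on the measuring side rather than improving the clock itself.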
  25. Daniel A. Mayer and Joel Sandin » Time Trial Data

    across different networks ‣ Analyzed response time distributions for different networks: • LAN • Internet at large • Cloud environments ! ! ! ‣ In order to exploit: distinguish response times. • Was the response t_0 or t_1 for given input? 25 Attacker Target POST /login with username, password Username or Password incorrect valid_user? verify_password Username or Password incorrect Access granted Start t_0 t_1
  26. Daniel A. Mayer and Joel Sandin » Time Trial Feasibility

    Based on Echo Trials ‣ What timing differences can be distinguished in practice? • Similar to the approach by Crosby et al. 26 Attacker Target “1,000” Start measured time sleep for 1,000 ns “1,000”
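An echo trial of this shape can be sketched with a local TCP server that sleeps for a requested interval and echoes it back while the client records the round trip. This is an illustrative reconstruction, not Time Trial's actual protocol, and it uses microsecond-scale sleeps since plain Ruby cannot sleep with nanosecond precision:

```ruby
require 'socket'

# Echo-trial sketch: the server sleeps the requested number of
# microseconds, then echoes the value; the client times the round trip.
server = TCPServer.new('127.0.0.1', 0)
port = server.addr[1]
Thread.new do
  loop do
    client = server.accept
    us = client.gets.to_i
    sleep(us / 1_000_000.0)
    client.puts us
    client.close
  end
end

def trial(port, us)
  t0 = Process.clock_gettime(Process::CLOCK_MONOTONIC, :nanosecond)
  s = TCPSocket.new('127.0.0.1', port)
  s.puts us
  s.gets
  s.close
  Process.clock_gettime(Process::CLOCK_MONOTONIC, :nanosecond) - t0
end

fast = Array.new(50) { trial(port, 0) }.min     # no injected delay
slow = Array.new(50) { trial(port, 2_000) }.min # 2 ms injected delay
puts "min round trip: #{fast} ns vs #{slow} ns"
```

Sweeping the injected delay downward until the two distributions can no longer be told apart gives the resolution limit for that network path, which is what the following LAN/WAN/EC2 slides report.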
  27. Daniel A. Mayer and Joel Sandin » Time Trial Timing

    Resolution: LAN 27 1,000 Repetitions
  28. Daniel A. Mayer and Joel Sandin » Time Trial Timing

    Resolution: LAN 27 1,000 Repetitions
  29. Daniel A. Mayer and Joel Sandin » Time Trial Timing

    Resolution: LAN 27 1,000 Repetitions
  30. Daniel A. Mayer and Joel Sandin » Time Trial Timing

    Resolution: LAN 27 1,000 Repetitions
  31. Daniel A. Mayer and Joel Sandin » Time Trial Timing

    Resolution: LAN 28 1,000 Repetitions 10,000 Repetitions 100,000 Repetitions
  32. Daniel A. Mayer and Joel Sandin » Time Trial Timing

    Resolution: LAN Limit ‣ 100 ns difference clear ‣ < 100 ns inconsistent 29 TKTK 1,000,000
  33. Daniel A. Mayer and Joel Sandin » Time Trial Timing

    Resolution: Loopback ‣ Better than 30 ns 30 100,000
  34. Daniel A. Mayer and Joel Sandin » Time Trial Timing

    Resolution: WAN Limit 31 100,000
  35. Daniel A. Mayer and Joel Sandin » Time Trial Timing

    Resolution: EC2 Limit 32 100,000!
  36. Daniel A. Mayer and Joel Sandin » Time Trial Timing

    Resolution: EC2 Limit 32 100,000!
  37. Daniel A. Mayer and Joel Sandin » Time Trial Overview

    of Results 33 [Chart: distinguishable timing differences per network type (Loopback, LAN, EC2, WAN) on a scale from < 100 ns to 1 ms]
  38. Daniel A. Mayer and Joel Sandin » Time Trial Impact

    on Real-world Applications ! ! ! ! ! ! ! 34
  39. Daniel A. Mayer and Joel Sandin » Time Trial String

    comparisons ‣ Most string comparisons return early • Leaks timing information about which byte differed 36 Attacker Target e9aa Start t_0 Valid Credential: e993 Credential invalid i == cred valid invalid e 9 9 3 e 9 a a e9aa == e993 == == == == t_1 Credential invalid
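The early-exit behavior on the slide can be modeled directly; this hypothetical helper returns how many bytes it inspected as a stand-in for elapsed time (the name and return shape are invented for illustration):

```ruby
# Naive early-exit comparison: stops at the first mismatching byte, so
# the number of bytes inspected (a proxy for time) reveals the length
# of the matching prefix.
def naive_eq(secret, guess)
  return [false, 0] unless secret.length == guess.length
  secret.bytes.zip(guess.bytes).each_with_index do |(s, g), i|
    return [false, i + 1] unless s == g # i + 1 bytes were inspected
  end
  [true, secret.length]
end

puts naive_eq('e993', 'e9aa').inspect
```

For the slide's credential `e993`, the guess `e9aa` is rejected only after three bytes while `e893` is rejected after two, so the inspection count, and hence the response time, tells the attacker how much of the guess is correct.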
  40. Daniel A. Mayer and Joel Sandin » Time Trial String

    comparison ‣ Introduced when attacker-controlled data is compared to a secret ! ‣ Commonly prone to timing attacks: • HMACs (e.g., session state) • Web API keys • OAuth token checks • Middleware authentication ! ‣ Exploitable remotely? 37
  41. Daniel A. Mayer and Joel Sandin » Time Trial String

    Comparison: Conclusions ‣ Most 64-bit OSes compare 8 bytes at a time! • http://rdist.root.org/2010/08/05/optimized-memcmp-leaks-useful- timing-differences/ ! 38
  42. Daniel A. Mayer and Joel Sandin » Time Trial Internet

    of Things ‣ BeagleBone Black: 1 GHz ARM Cortex-A8 • Java benchmarks put it within reach, exit on first byte: 39
  43. Daniel A. Mayer and Joel Sandin » Time Trial Microbenchmarks

    (in nanoseconds) 40

    Language | Function      | Lawson 2010* (per byte) | i5-3210M 2.50GHz (per word) | Cortex-A8 1GHz (per byte)
    C        | memcmp        | 0.719                   | 0.243                       | 1.37
    C        | strcmp        | -                       | 0.41                        | 4.04
    Ruby     | str ==        | 0.840                   | 0.36                        | 1.75
    Python   | str ==        | 1.400                   | 0.224                       | 1.48
    Java     | String.equals | 40.594                  | 7.65                        | 18.91

    * Hardware: AMD Athlon X2 2.7 GHz
    ‣ Resolution < differences of multiple bytes ‣ Remote exploitation highly unlikely in practice!
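A rough microbenchmark of Ruby's `String#==` in the spirit of this table can be run as follows; the numbers vary by machine and are not the table's own measurements, and amortizing over a long string is an assumption made here to wash out per-call overhead:

```ruby
require 'benchmark'

# Estimate the per-byte cost of String#== on equal strings, where the
# comparison must scan the whole length.
def per_byte_ns(len, reps = 100_000)
  a = 'x' * len
  b = 'x' * len
  t = Benchmark.realtime { reps.times { a == b } }
  t * 1e9 / (reps * len)
end

puts format('String#== ~%.3f ns per byte at length 1024', per_byte_ns(1024))
```

Comparing the result against the resolution limits from the earlier slides (about 100 ns on a LAN) shows why single-byte differences are out of reach on fast hardware while slower platforms such as the Cortex-A8 running Java come closer.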
  44. Daniel A. Mayer and Joel Sandin » Time Trial Branching

    ‣ Different code path based on secret state ! ‣ Timing difference depends on application ! ‣ Which operation performed in each code path? 41 Attacker Target POST /login with username, password Username or Password incorrect valid_user? verify_password Username or Password incorrect Access granted Start t_0 t_1
  45. Daniel A. Mayer and Joel Sandin » Time Trial Branching

    ‣ User enumeration (SHA-256) • (Not a SHA-256 attack!) 42 LAN WAN
  46. Daniel A. Mayer and Joel Sandin » Time Trial Time-Based

    Padding Oracle ‣ AES CBC Padding Oracle ‣ Distinguish • Wrong Padding • Other Processing Error 43 Attacker Target GET /decrypt?data=[encrypted data] an error occurred. decryption_successful? process_data an error occurred. data processed successfully. Start t_0 t_1
  47. Daniel A. Mayer and Joel Sandin » Time Trial Time-Based

    Padding Oracle ‣ Perform SQLite query when decrypt successful • Actual difference depends on application! 44
  48. Daniel A. Mayer and Joel Sandin » Time Trial DEMO:

    Time-Based CBC Padding Oracle 45
  49. Daniel A. Mayer and Joel Sandin » Time Trial Take

    Away: Microbenchmarks ‣ Computing performance continues to improve • Comparison-based vulnerabilities difficult to exploit. ! ‣ Branching-based often feasible ! ‣ Embedded systems at greater risk • Java on ARM a feasible target • Attacking string-comparison on Arduino realistic. • Paul McMillan’s talk at DEFCON + Ekoparty 2014 46
  50. Daniel A. Mayer and Joel Sandin » Time Trial Preventing

    timing attacks ‣ Ensure sensitive operations take constant time • Analyze for branching side-channels • This is hard! ! ‣ Use constant time comparison functions • See our white paper ! ‣ Best practices • Throttle or lock out misbehaving clients • Monitor for failed requests 47
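A constant-time comparison of the kind the slide recommends accumulates differences over every byte instead of exiting early. This is a minimal sketch; real deployments should prefer a vetted helper where available (Ruby's openssl library provides `OpenSSL.secure_compare`, and Rack provides `Rack::Utils.secure_compare`):

```ruby
# Constant-time comparison sketch: OR the XOR of every byte pair into an
# accumulator so the loop always runs to the end, regardless of where
# (or whether) the inputs differ.
def constant_time_eq(a, b)
  return false unless a.bytesize == b.bytesize
  diff = 0
  a.bytes.zip(b.bytes) { |x, y| diff |= x ^ y }
  diff.zero?
end

puts constant_time_eq('e993', 'e9aa')
```

The early length check leaks only the secret's length, which for fixed-size values such as HMACs or API keys is public anyway.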
  51. Daniel A. Mayer and Joel Sandin » Time Trial Future

    Plans ‣ More empirical studies ! ‣ Implement (feasible!) attacks ! ‣ Jitter changes over time • Interleave long and short measurements ! ! Send bug reports, feature / pull requests! 48
  52. Daniel A. Mayer and Joel Sandin » Time Trial Thanks!

    Questions? 49 https://github.com/dmayer/time_trial ‣ Daniel A. Mayer [email protected] @DanlAMayer ‣ Joel Sandin [email protected] http://matasano.com/research/