Defining Fast: The Hardest Problem in Performance Engineering

We all want fast sites, but what is fast? What is performant? We may know it when we see it, yet quantifying and communicating about web performance effectively is still a challenge. In this talk, we will discuss our ever-evolving set of standards for what comprises a fast site, how to measure it, and what to do once we have the resulting data.


Zack Tollman

November 02, 2019



  1. Defining Fast: The Hardest Problem in Performance Engineering. Zack Tollman | Condé Nast
  2. Menu Master

  7. Time to Sammie

  9. Time to Purchase

  13. 1. Metrics 2. Observing 3. Reporting

  14. The Load Event

  18. Golden Age of Performance Metrics

  19. JS Parse/Compile Time, Load Time, Start Render, First Contentful Paint, MS First Paint, Total Byte Weight, Time to Interactive, First CPU Idle, Hero Element Load Time, DOM Content Loaded, First Paint, Total Requests, Perceptual Speed Index, First Meaningful Paint, Speed Index
  20. What Should You Care About?

  21. Does the Page Appear to be Loading?

  22. Server Timing API, Time to First Byte, Start Render, First Paint, First Contentful Paint
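The Server Timing API carries backend timings to the client via the `Server-Timing` response header, so they can sit alongside front-end metrics like First Paint. As a hedged sketch, here is a minimal parser for a raw header value; in practice, browsers expose already-parsed entries on `PerformanceResourceTiming.serverTiming`, so hand-parsing like this is only needed outside that path:

```javascript
// Minimal parser for a raw Server-Timing header value (a sketch, not a
// full RFC-grade parser; quoted commas/semicolons are not handled).
function parseServerTiming(header) {
  return header.split(",").map((entry) => {
    const [name, ...params] = entry.trim().split(";");
    const metric = { name: name.trim() };
    for (const param of params) {
      const [key, value] = param.trim().split("=");
      if (key === "dur") metric.duration = Number(value);
      if (key === "desc") metric.description = value.replace(/^"|"$/g, "");
    }
    return metric;
  });
}

const metrics = parseServerTiming('db;dur=53, cache;desc="Cache Read";dur=23.2');
console.log(metrics);
// → [ { name: 'db', duration: 53 },
//     { name: 'cache', description: 'Cache Read', duration: 23.2 } ]
```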
  23. Can I see meaningful content?

  24. First Meaningful Paint, Speed Index, Element Timing API

  25. Can I click on or scroll the page?

  26. First CPU Idle, First Input Delay, Rage Clicks
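Of these, rage clicks are not a browser-provided metric; analytics tools infer them from click streams as a sign the page looked interactive but was not. A minimal sketch, assuming a hypothetical definition of three or more clicks within one second (both thresholds are illustrative, not a standard):

```javascript
// Flags "rage clicks": a burst of rapid repeated clicks, a common signal
// that the user tried to interact with an unresponsive page.
// threshold and windowMs are illustrative defaults, not a standard.
function hasRageClicks(clickTimes, threshold = 3, windowMs = 1000) {
  const sorted = [...clickTimes].sort((a, b) => a - b);
  // Slide a window of `threshold` consecutive clicks over the sorted times.
  for (let i = 0; i + threshold - 1 < sorted.length; i++) {
    if (sorted[i + threshold - 1] - sorted[i] <= windowMs) return true;
  }
  return false;
}

console.log(hasRageClicks([0, 200, 450]));   // true: 3 clicks within 450 ms
console.log(hasRageClicks([0, 2000, 4000])); // false: clicks are spread out
```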

  27. Does the page continue to be usable?

  28. Time to Interactive, Long Tasks, User Timing API
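The User Timing API lets you bracket application-specific work with marks and measures, which RUM tooling can then collect. A minimal sketch, runnable in browsers and Node.js 16+; `expensiveWork` is a stand-in for whatever unit of work you want to time:

```javascript
// User Timing API sketch: bracket a unit of work with marks, then measure.
// `performance` is a global in browsers and in Node.js >= 16.
function expensiveWork() {
  let total = 0;
  for (let i = 0; i < 1e6; i++) total += i;
  return total;
}

performance.mark('work-start');
expensiveWork();
performance.mark('work-end');
performance.measure('work', 'work-start', 'work-end');

const [measure] = performance.getEntriesByName('work');
console.log(measure.name, measure.duration); // duration in ms; varies by machine

// In the browser, Long Tasks (main-thread blocks > 50 ms) can be watched
// with the related PerformanceObserver API:
// new PerformanceObserver((list) => console.log(list.getEntries()))
//   .observe({ type: 'longtask', buffered: true });
```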

  30. User-centric Performance Metrics https://developers.google.com/web/fundamentals/performance/user-centric-performance-metrics

  31. Metrics Picked, Now What?

  32. Measure The Metrics

  33. Synthetic Monitoring

  34. Location Network Device Browser

  35. Test Frequency

  37. RUM: Real User Monitoring

  38. Instrumentation Challenges

  39. Variance is Provided

  41. Test with Synthetic, Verify with RUM

  42. Performance Observed, Now What?

  43. Mean, median, percentiles, histograms, standard deviation
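These summary statistics are what turn raw RUM samples into a report, and percentiles usually tell a truer story than the mean because load times are heavily skewed. A sketch of computing them over a set of timing samples, using the nearest-rank percentile convention (one of several in common use):

```javascript
// Summary statistics for a set of timing samples (in ms).
function summarize(samples) {
  const sorted = [...samples].sort((a, b) => a - b);
  const mean = sorted.reduce((sum, x) => sum + x, 0) / sorted.length;
  // Nearest-rank percentile: the smallest sample such that at least
  // p% of samples are <= it.
  const percentile = (p) => {
    const idx = Math.ceil((p / 100) * sorted.length) - 1;
    return sorted[Math.max(0, idx)];
  };
  const variance =
    sorted.reduce((sum, x) => sum + (x - mean) ** 2, 0) / sorted.length;
  return {
    mean,
    median: percentile(50),
    p95: percentile(95),
    stdDev: Math.sqrt(variance),
  };
}

console.log(summarize([120, 150, 130, 900, 140]));
// mean 288 vs median 140: one slow outlier drags the mean far above the median
```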

  45. Alerting

  47. Accessibility, Percent Change, Competitor Analysis, Product Comparisons

  48. Defining Fast Defines Your Performance Culture

  49. speakerdeck.com/tollmanz