Defining Fast: The Hardest Problem in Performance Engineering

We all want fast sites, but what is fast? What is performant? We may know it when we see it, yet quantifying and communicating about web performance effectively is still a challenge. In this talk, we will discuss our ever-evolving set of standards for what constitutes a fast site. With special attention to the problems that ads and analytics present for publishers, we will examine how antiquated notions of web performance create a ripe environment for abuse by third-party code. Finally, we will cover techniques for improving performance monitoring as a tool for institutional change.

Zack Tollman

August 08, 2019

Transcript

  1. Defining Fast: The Hardest Problem in Performance Engineering. Zack Tollman | Condé Nast

  2. Menu Master

  7. Time to Sammich

  12. The Load Event
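
The load event was long the default yardstick for "fast." A minimal sketch of capturing it with the Navigation Timing API (the logging destination is a placeholder):

```typescript
// Read the load event timing from the Navigation Timing API once the
// page has finished loading. Timestamps are milliseconds relative to
// the start of navigation.
window.addEventListener('load', () => {
  const [nav] = performance.getEntriesByType(
    'navigation'
  ) as PerformanceNavigationTiming[];
  if (nav) {
    console.log(`load event fired at ${nav.loadEventStart.toFixed(0)} ms`);
  }
});
```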

  16. Golden Age of Performance Metrics

  17. Too Many Metrics

  18. JS Parse/Compile Time, Load Time, Start Render, First Contentful Paint, MS First Paint, Total Byte Weight, Time to Interactive, First CPU Idle, Hero Element Load Time, DOM Content Loaded, First Paint, Total Requests, Perceptual Speed Index, First Meaningful Paint, Speed Index

  19. Tools Are Fantastic

  20. What Should You Care About?

  21. Does the Page Appear to be Loading?

  22. Server Timing API, Time to First Byte, Start Render, First Paint, First Contentful Paint
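
A sketch of how these might be captured in the browser: Time to First Byte and any Server-Timing response headers come from the navigation entry, while First Paint and First Contentful Paint arrive as paint entries (all logging is illustrative):

```typescript
// Time to First Byte and Server-Timing come from the navigation entry.
const [nav] = performance.getEntriesByType(
  'navigation'
) as PerformanceNavigationTiming[];
if (nav) {
  console.log(`TTFB: ${(nav.responseStart - nav.requestStart).toFixed(0)} ms`);
  for (const timing of nav.serverTiming) {
    // Populated from the Server-Timing response header, if the server sends one.
    console.log(`server-timing ${timing.name}: ${timing.duration} ms`);
  }
}

// First Paint and First Contentful Paint arrive as 'paint' entries.
new PerformanceObserver((list) => {
  for (const entry of list.getEntries()) {
    console.log(`${entry.name}: ${entry.startTime.toFixed(0)} ms`);
  }
}).observe({ type: 'paint', buffered: true });
```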
  23. Can I see meaningful content?

  24. First Meaningful Paint Speed Index Element Timing API
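
The Element Timing API lets you flag a specific element, such as a hero image, and learn when it rendered. A sketch, assuming markup like <img src="hero.jpg" elementtiming="hero-image">; the entry type is not available in every browser, and the local interface below stands in for typings that may be missing:

```typescript
// Minimal shape of an element timing entry; real typings may differ.
interface ElementTimingEntry extends PerformanceEntry {
  identifier: string; // value of the elementtiming attribute
  renderTime: number; // when the element was painted
}

new PerformanceObserver((list) => {
  for (const entry of list.getEntries() as ElementTimingEntry[]) {
    if (entry.identifier === 'hero-image') {
      console.log(`hero element rendered at ${entry.renderTime.toFixed(0)} ms`);
    }
  }
}).observe({ type: 'element', buffered: true });
```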

  25. Can I click on or scroll the page?

  26. First CPU Idle First Input Delay Rage Clicks
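
First Input Delay can be measured in the field as the gap between the user's first interaction and the moment the browser could start handling it. A minimal sketch; production RUM libraries add more edge-case handling:

```typescript
// FID is processingStart - startTime on the 'first-input' event timing
// entry. Logging is a placeholder.
new PerformanceObserver((list) => {
  for (const entry of list.getEntries() as PerformanceEventTiming[]) {
    const fid = entry.processingStart - entry.startTime;
    console.log(`First Input Delay: ${fid.toFixed(1)} ms`);
  }
}).observe({ type: 'first-input', buffered: true });
```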

  27. Does the page continue to be usable?

  28. Time to Interactive Long Tasks User Timing API
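
Two ways to watch post-load usability, sketched below: a Long Tasks observer flags main-thread work over 50 ms, and the User Timing API marks app-specific moments (the widget-init names are illustrative):

```typescript
// Long Tasks: any main-thread block over 50 ms shows up as a 'longtask' entry.
new PerformanceObserver((list) => {
  for (const task of list.getEntries()) {
    console.log(
      `long task: ${task.duration.toFixed(0)} ms at ${task.startTime.toFixed(0)} ms`
    );
  }
}).observe({ type: 'longtask', buffered: true });

// User Timing: bracket a unit of work with marks, then measure the span.
function initWidget(): void {
  /* placeholder for real work */
}

performance.mark('widget-init-start');
initWidget();
performance.mark('widget-init-end');
performance.measure('widget-init', 'widget-init-start', 'widget-init-end');
```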

  30. Philip Walton, "User-centric Performance Metrics": https://developers.google.com/web/fundamentals/performance/user-centric-performance-metrics

  31. Metrics Picked Now What?

  32. Measure The Metrics

  33. Synthetic And Real User Monitoring

  34. Location

  35. Network

  36. Device

  37. Browser

  38. Test Frequency

  39. RUM: Real User Monitoring
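
RUM means instrumenting real sessions and beaconing the numbers home. A minimal sketch; the /rum endpoint and the choice to beacon paint entries are illustrative:

```typescript
// Send a measured metric to an analytics endpoint, surviving page unload.
function sendToAnalytics(name: string, value: number): void {
  const body = JSON.stringify({ name, value, url: location.href });
  // sendBeacon queues the request even during unload; fall back to
  // fetch with keepalive if the beacon is rejected.
  if (!navigator.sendBeacon('/rum', body)) {
    fetch('/rum', { method: 'POST', body, keepalive: true });
  }
}

new PerformanceObserver((list) => {
  for (const entry of list.getEntries()) {
    sendToAnalytics(entry.name, entry.startTime);
  }
}).observe({ type: 'paint', buffered: true });
```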

  40. Variance is Provided

  41. Instrumentation Challenges

  42. Test with Synthetic

  43. Verify with RUM

  44. Data

  45. mean, median, average, percentiles, standard deviation
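
A sketch of these summary statistics over a set of RUM samples; the percentile function uses the nearest-rank method, while real tooling may interpolate, and the sample values are illustrative:

```typescript
// Mean (average) of a sample set.
function mean(values: number[]): number {
  return values.reduce((sum, v) => sum + v, 0) / values.length;
}

// Nearest-rank percentile: p = 50 gives the median, p = 95 the 95th percentile.
function percentile(values: number[], p: number): number {
  const sorted = [...values].sort((a, b) => a - b);
  const rank = Math.ceil((p / 100) * sorted.length) - 1;
  return sorted[Math.max(0, rank)];
}

// Population standard deviation: spread around the mean.
function stdDev(values: number[]): number {
  const m = mean(values);
  return Math.sqrt(mean(values.map((v) => (v - m) ** 2)));
}

const samples = [1200, 950, 3100, 1800, 760]; // illustrative load times (ms)
console.log(mean(samples), percentile(samples, 50), percentile(samples, 95), stdDev(samples));
```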

  46. Consider Your Audience

  47. Report Raw Values

  49. Competitor Comparisons

  50. Percent Change
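
Percent change keeps reports legible when raw values vary by page or device class. A trivial sketch with illustrative numbers:

```typescript
// Express a metric change as a percentage of the baseline.
const before = 1800; // ms, baseline First Contentful Paint
const after = 1350;  // ms, after the change
const percentChange = ((after - before) / before) * 100;
console.log(`${percentChange.toFixed(1)}% change`); // -25.0% change
```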

  51. Business metric correlations

  52. Defining Fast Defines Your Performance Culture

  53. Don't Let Others Define Your Culture

  54. speakerdeck.com/tollmanz