
Meaningful performance metrics

If you can't measure it, you can't improve it. But measuring load time is easy, right? "Load time" is an outdated concept, as single page app experiences take over the web. We need better metrics to measure what our users are really feeling.

Joshua Nelson

December 09, 2018

Transcript

  1. @nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics * We make tools for teams *

    Hands up if you have ever used one of these tools * Honestly, even more stuff is out there than is listed here
  2. @nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics * This is what Jira cloud looks

    like * If you haven’t used it for a while, check it out, we’ve made it even better
  3. @nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics Performance We’re going to talk about performance.

    It’s something we’re focusing on at Atlassian. We’re looking to improve real user experiences
  4. @nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics * I know the pain of slow

    internet * They did this test when I lived there, in 2009 * The Prime Minister said: “If [the opposition] had their way Australians would be left using carrier pigeons for the future rather than accessing an internationally competitive broadband network” * So, they tested it. They flew a carrier pigeon with a 700 MB USB drive from the central west to Sydney, about 100 km or 60 miles. https://www.itnews.com.au/news/australian-internet-fails-pigeon-test-159232
  5. @nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics * The pigeon won, in 1:05 *

    They sent a car too, which took 2:10 * The internet dropped out twice and didn’t even make it
  6. @nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics * Internet speeds suck out there.

    “Never underestimate the bandwidth of a station wagon full of tapes hurtling down the highway.” – Andrew Tanenbaum, 1981
  7. @nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics Good news! Internet gets faster over time

    * But don’t worry! As technology changes, things get better over time automatically!
  8. @nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics * Internet speed in the US over

    time, in megabits per second. Going up! https://www.statista.com/statistics/616210/average-internet-connection-speed-in-the-us/
  9. @nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics * And great! Thanks to Moore’s law,

    CPU speeds have been going up for years! * Sweet, problem solved, right? https://www.statista.com/statistics/616210/average-internet-connection-speed-in-the-us/
  10. @nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics * Massive survey on the state of

    javascript * Median “onload” time: 5.5 seconds on desktop, 15.6 seconds on mobile * https://httparchive.org/reports/state-of-javascript?start=earliest&end=latest&view=list
  11. @nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics 1% increase in capacity → Up to

    1.1% increase in demand * The “induced demand” effect: for every 1 percent increase in highway capacity, traffic increases 0.29 to 1.1 percent in the long term (about five years out), and up to 0.68 percent in the short term (one or two years) https://trrjournalonline.trb.org/doi/abs/10.3141/2653-02?journalCode=trr
  12. @nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics * Same survey shows us that page

    weight is going up too * https://httparchive.org/reports/state-of-javascript?start=earliest&end=latest&view=list
  13. @nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics Business goals One second delay in Bing

    results in a 2.8% drop in revenue. Two second delay results in 4.3% drop. * https://wpostats.com/tags/revenue/
  14. @nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics Business goals Walmart: 2% increase in conversions

    for every 1 second of improvement in load time. Every 100ms improvement also resulted in up to a 1% increase in revenue. * It’s a UX problem * https://wpostats.com/tags/revenue/
  15. @nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics Business goals Google's DoubleClick found that publishers

    whose mobile sites load in 5 seconds earn up to 2x more mobile ad revenue than sites loading in 19 seconds. * It’s a UX problem * https://wpostats.com/tags/revenue/
  16. @nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics The world “Youtube feather” * Made a

    version of YouTube that was 90% lighter * Opted in a small % of traffic * But average load time went up??
  17. @nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics The world Southeast Asia, South America, Africa

    and Siberia. * Actually more traffic from countries with poor connectivity! * Loading from these countries would normally have taken two minutes! * Australia not far off * Actually affects whether people can use it at all * https://wpostats.com/2015/11/11/youtube-feather.html
  18. @nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics User happiness 0.1s 1s 10s Instant “Flow”

    Task switch * Research has shown that there are three different types of tasks * <0.1s Instant – the system is reacting immediately * <1s Flow – I’m in a state of flow and my concentration is not broken between tasks * <10s Task switch – I’m out of here * https://www.nngroup.com/articles/response-times-3-important-limits/
  19. @nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics User happiness 0.1s 1s 10s Instant “Flow”

    Task switch ☹ * Between 1s and 10s there is a whole lot of emotion too * Users are progressively getting frustrated and want to switch to another task in this period of time
  20. @nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics User happiness 0.1s 1s 10s Instant “Flow”

    Task switch ☹ Median desktop 5.5s, median mobile 15.6s * Between 1s and 10s there is a whole lot of emotion too * Users are progressively getting frustrated and want to switch to another task in this period of time
  21. @nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics Preaching to the choir * I’m sure

    I’m preaching to the choir here, if you’ve come to this talk. * Do you agree with these statistics that I’ve mentioned so far? * Have you already heard it before? * Do you really care?
  22. @nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics This continues until the inevitable heat death

    of the universe. Ever-expanding entropy claims all. Chaos reigns supreme. We can’t control it. Performance will always regress.
  23. @nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics Why measure What to measure ✅ “Good”

    metrics ⚖ Metric choices ⚒ Ways to measure Protect performance Here’s an overview of what I’m going to talk about today
  24. @nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics Why measure What to measure ✅ “Good”

    metrics ⚖ Metric choices ⚒ Ways to measure Protect performance First, talk about why measure
  25. @nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics “I already know what to fix, let’s

    just fix it!” This is really tempting You probably already have an idea of what changes you need to make. “Yeah, we need to get our bundle size down, and we need to speed up this particular endpoint” However! Don’t let the takeaway from this talk be “I should go back and improve my app’s performance in this specific way”
  26. @nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics You can’t improve what you can’t measure

    This is a popular saying Meaning: changing (improving) something requires knowing what you want to change, and how you can tell if it worked. It’s useful because you can know if you’re doing the right things for your goals. However, I think it’s missing something
  27. @nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics You can’t meaningfully improve what you can’t

    meaningfully measure I’m going to drop the word “meaningful” in there. It’s important that you’re measuring the right things Measuring something is easy, but measuring the right thing is hard You can’t make meaningful impacts without careful thought about what you’re measuring! You can’t change the right things if you’re not measuring the right things
  28. @nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics You can’t meaningfully improve what you can’t

    meaningfully measure Like, really This slide deck contains the word meaningful 46 times
  29. @nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics Better for users It’s what’s impactful for

    your users Ultimately, this is what it’s about Have a user centric metric As we said before, performance is a UX problem, so we need to measure what the users really are experiencing and put some thought into that
  30. @nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics Why measure What to measure ✅ “Good”

    metrics ⚖ Metric choices ⚒ Ways to measure Protect performance Let’s take a look at what we can measure in order to improve the UX
  31. @nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics HTML Javascript CSS Assets Data { }

    * Which is, from our perspective, HTML, JavaScript, CSS, assets, and data * So, how do these arrive in the user’s browser?
  32. @nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics DNS lookup GET request Parse HTML Fetch

    js Fetch CSS Parse js Parse CSS Layout / paint * How does the computer see what matters to our users * I’m going to go over the metrics themselves later, slide deck is available for reference * The way that this is put together is relevant for us so we can know which parts are meaningful for the user
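    As a rough sketch of how that pipeline shows up in the browser’s own timing data, the Navigation Timing API exposes a timestamp for most of these phases (field names are from the PerformanceNavigationTiming entry; run this after the load event so the later timestamps are populated):

    // Sketch: break the load pipeline into phases using the Navigation Timing API.
    window.addEventListener('load', () => {
      const [nav] = performance.getEntriesByType('navigation') as PerformanceNavigationTiming[];
      if (!nav) return;
      console.log('DNS lookup:', nav.domainLookupEnd - nav.domainLookupStart, 'ms');
      console.log('TCP connect:', nav.connectEnd - nav.connectStart, 'ms');
      console.log('Request (until first byte):', nav.responseStart - nav.requestStart, 'ms');
      console.log('Response download:', nav.responseEnd - nav.responseStart, 'ms');
      console.log('DOM processing:', nav.domContentLoadedEventEnd - nav.responseEnd, 'ms');
    });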
  33. @nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics Done? * This is what you’d traditionally

    consider a page to be * But, remember, we’re thinking about this in a meaningful context. What is meaningful to our users? * There’s more to what they might consider a “page” * With SPAs and PWAs, we need to be even more nuanced about what’s going on here
  34. @nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics Fetch deferred js Parse deferred js Fetch

    more data Event handler * If we go forward in time a bit, we see there’s actually more
  35. @nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics Fetch more data Event handler Push state

    Load data Rerender * You might reload data on button click * This is quite a different experience from the initial load, but a load nonetheless! *
  36. @nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics • An HTML document • A single

    page app state • An app state • Whatever your users think it is So, what’s a “page”? * We need to keep in mind what our users would think of as a page, and have measurements around that experience
  37. @nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics Load is not a single moment in

    time — it’s an experience that no one metric can fully capture. There are multiple moments during the load experience that can affect whether a user perceives it as "fast" or “slow" – https://w3c.github.io/paint-timing/ https://w3c.github.io/paint-timing/
  38. @nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics Loading is not a boolean state There

    is no one event, or one easy answer to say when a page has loaded It might even require knowledge of the future! Loading is a spectrum Metrics compress this spectrum into a single number We need to be careful about how we choose this number
  39. @nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics Measure these things! * We need to

    measure these things! * We need to pick metrics that take these user attributes into account
  40. @nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics Why measure What to measure ✅ “Good”

    metrics ⚖ Metric choices ⚒ Ways to measure Protect performance Let’s take a look at what makes a good metric
  41. @nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics ⚠ Bad metrics are out there! *

    Bad metrics are out there * The default way you think about it might be bad! * Your tools might be measuring it in a bad way
  42. @nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics ⚠ Bad metrics are dangerous * You’ll

    focus on the wrong things, neglecting real issues * You’ll change random numbers, but users will remain dissatisfied! * Bad news is good for your health if you need it
  43. @nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics Reflects real experiences * Meaningful = meaningful

    for users * We need to figure out what our users are really experiencing with our metrics * There are a number of ways that we can ensure we’re measuring this for real users
  44. @nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics ✅ Real devices ✅ Real networks ✅

    Sanity check Reflects real experiences * If it is run on real devices, that’s a good sign * We need to know what real users’ networks are like * Do a sanity check. When is your metric being triggered? Is that state actually what you think it is?
  45. @nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics Is a “continuous” function A small improvement

    in the metric relates to a small improvement in the UX. * Example of something that isn’t: total bundle size. If we’re code splitting and lazily loading chunks, reducing the total may do nothing for the experience
  46. @nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics Is a “continuous” function Side effect: no

    cheating! * Think of the people looking at this metric as a greedy optimisation algorithm * If they can cheat, they will * The shortest path to improve the metric should be the one that will improve the user experience * You shouldn’t be rewarded for loading a loading spinner very quickly, if that doesn’t result in a correspondingly good UX
  47. @nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics Never send a human To do a

    machine’s job * As Agent Smith would say, “Never send a human to do a machine’s job” * Some metrics you can easily get in a repeatable way through monitoring * Others (like auditing tools) are tempting to have as one-offs * Spend the time building automatic tooling to report. This is critical * If you rely on humans, this isn’t going to work * Historical data is your friend in arguments
  48. @nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics Why measure What to measure ✅ “Good”

    metrics ⚖ Metric choices ⚒ Ways to measure Protect performance With the knowledge of what is a good and bad metric, let’s look at some, and think about them for our use cases
  49. @nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics Good for all apps With the knowledge

    of what is a good and bad metric, let’s look at some, and think about them for our use cases
  50. @nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics Page weight ✅ Easy to measure ✅

    js is expensive ⚠ Proxy for real experience * See “The Cost of JavaScript” blog post by Addy Osmani
  51. @nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics Paint timing! First paint First contentful paint

    First meaningful paint * You can guess my favourite (the one that has meaningful in the name) * First paint is when the browser first renders anything other than white * First contentful paint is when the browser renders any element (e.g. a spinner) * First meaningful paint is when the browser renders something that is meaningful for the user (e.g. data) * The first two are easy to measure, but the last one is the real pot of gold. It’s harder to measure in a consistent way.
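    First paint and first contentful paint can be read straight off the Paint Timing API; first meaningful paint has no standard browser event, which is part of why it’s harder. A minimal sketch for the first two:

    // Sketch: observe first-paint and first-contentful-paint entries.
    // Register this observer as early as possible in the page.
    new PerformanceObserver((list) => {
      for (const entry of list.getEntries()) {
        // entry.name is 'first-paint' or 'first-contentful-paint'
        console.log(entry.name, Math.round(entry.startTime), 'ms');
      }
    }).observe({ entryTypes: ['paint'] });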
  52. @nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics Speed index Integrate the % of the

    page loaded over time * Speed index looks at what % of the page is loaded over time, retroactively * We then score it based on how much of it was delivered, and how soon * This maps really well onto perceived load time * Optimizing for this number will help load time
  53. @nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics Speed index ✅ Very reflective of real

    UX ✅ Reflects progressive loading ⚠ Hard to measure on real devices ⚠ Hard to understand (unitless) Integrate the % of the page loaded over time * It’s one of the best metrics for reflecting progressive loading
  54. @nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics ATF render time ✅ Reflective of UX

    for initial load ✅ Easy to understand ⚠ Doesn’t deal with post initial load ⚠ Hard to measure in real browsers ⚠ Your users probably scroll Time until all content above the fold is rendered * http://abovethefold.fyi/ *
  55. @nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics Load event Triggered after DOMContentLoaded, and after

    js downloads ✅ Easy to measure ✅ Easy to understand ⚠ No async data requests ⚠ May not be meaningful * This one is super common, available everywhere, and easy to implement * Risky though
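    Measuring it is about as simple as metrics get, which is exactly why it’s tempting. A minimal sketch:

    // Sketch: log DOMContentLoaded and load, relative to navigation start.
    document.addEventListener('DOMContentLoaded', () => {
      console.log('DOMContentLoaded at', Math.round(performance.now()), 'ms');
    });
    window.addEventListener('load', () => {
      console.log('load at', Math.round(performance.now()), 'ms');
    });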
  56. @nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics First byte When did the first byte

    arrive in the browser? ✅ Easy to measure ✅ Measures backend problems ⚠ May not be meaningful * Performance timing API can help you measure this * It mightn’t mean anything if the first byte isn’t meaningful itself
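    With the older performance.timing interface (supported well beyond evergreen browsers), time to first byte is one subtraction:

    // Sketch: time to first byte from the legacy Navigation Timing interface.
    const timing = performance.timing;
    console.log('Time to first byte:', timing.responseStart - timing.navigationStart, 'ms');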
  57. @nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics Time to interactive The time it takes

    until buttons work ✅ Measures user interactions ✅ Highly interactive apps ⚠ Needs a polyfill ⚠ Less meaningful after page load * Works by detecting CPU idle time, and picks a point where the buttons on your page will probably work
  58. @nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics First input delay On the first click,

    how long did that take? ✅ Reflects actual user pain ⚠ Depends on user input (focus on 90th percentile) * First input delay reflects actual user problems * Will naturally depend on when the user interacts
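    The idea behind measuring it is small: note when the first input event was generated and how long the main thread took to start handling it. A rough sketch, not the real polyfill, and it ignores edge cases the real one handles:

    // Sketch: approximate first input delay = handler start time - input event timestamp.
    let reported = false;
    const onFirstInput = (event: Event) => {
      if (reported) return;
      reported = true;
      const delay = performance.now() - event.timeStamp;
      console.log('Approximate first input delay:', Math.round(delay), 'ms');
    };
    for (const type of ['click', 'keydown', 'pointerdown']) {
      addEventListener(type, onFirstInput, { capture: true, once: true, passive: true });
    }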
  59. @nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics Avoid With the knowledge of what is

    a good and bad metric, let’s look at some, and think about them for our use cases
  60. @nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics DNS lookup GET request Parse HTML Fetch

    js ⏳DOMContentLoaded ⏳First byte Fetch CSS ⏳Navigation timing API Parse js Parse CSS Layout / paint ⏳First contentful paint * Remember this timeline from before? Now we can put some events on it to measure the various bits!
  61. @nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics Fetch js Fetch deferred js ⏳DOMContentLoaded Parse

    deferred js Fetch CSS Parse js Parse CSS Fetch more data Layout / paint ⏳First paint ⏳First meaningful paint ⏳First CPU idle (interactive) ⏳First contentful paint
  62. @nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics Fetch deferred js Parse deferred js Fetch

    more data ⏳First paint ⏳First meaningful paint ⏳First CPU idle (interactive) ⏳First contentful paint ⏳First input delay Event handler
  63. @nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics Fetch more data ⏳First input delay Event

    handler Push state Load data Rerender ⏳Page reloaded * You might reload data on button click * This is quite a different experience from the initial load, but a load nonetheless!
  64. @nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics Metric starter pack Time to interactive Bundle

    size First input delay ✨ First meaningful paint * If you don’t have time, these three strike a good balance between ease of adoption and general meaningfulness. * But if you do, you should investigate first meaningful paint.
  65. @nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics Why measure What to measure ✅ “Good”

    metrics ⚖ Metric choices ⚒ Ways to measure Protect performance Now, let’s look at the ways we can measure these metrics
  66. @nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics ⚒ Tools * A metric is only

    as good as your ability to measure it * We may need to make compromises in order to be able to measure things
  67. @nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics Devtools ✅ debug ❌ repeatable ❌ actual

    users * One of the best tools is right under your nose * Measure just about all of the previous metrics in a debugging context * You can simulate real devices, but ultimately it isn’t real * Hard to make a business case from the devtools
  68. @nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics User timing API * User timing API

    is available so you can get custom insights * Supported in IE11+ and all evergreen browsers
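    It boils down to two calls: mark interesting moments, then measure between them. The mark names below are made up for illustration:

    // Sketch: custom marks and a measure around an app-defined "meaningful" moment.
    performance.mark('issue-list:fetch-start');
    // ...fetch and render the issue list...
    performance.mark('issue-list:rendered');
    performance.measure('issue-list:load', 'issue-list:fetch-start', 'issue-list:rendered');

    const [measure] = performance.getEntriesByName('issue-list:load');
    console.log('Issue list took', Math.round(measure.duration), 'ms');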
  69. @nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics github.com/jpnelson/react-component-timing * We have a tool that

    we use to measure component load speeds and first meaningful paint, built into React * It also marks things using the user timing API, and integrates with tracking tools * This is still being refined, but it’s open to PRs and comments, please!
  70. @nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics Lighthouse ✅ debug ❌ repeatable ❌ actual

    users * AKA the “Audits” tab in Google Chrome * Relatively new integration into Chrome * Does a good job of simulating real devices (e.g. network speeds and device speeds)
  71. @nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics Lighthouse • First contentful paint • Speed

    index • Time to interactive • First meaningful paint* • Many more * First contentful paint, Speed index, first meaningful paint*
  72. @nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics Lighthouse CI ✅ debug ✅ repeatable ❌

    actual users * Remember, never send a human to do a machine’s job * We need a repeatable measurement to make an impact * Lighthouse CI can tell you at PR time – before any regressions land
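    One hedged way to wire this into CI: run the Lighthouse CLI against a branch deploy, write a JSON report, and fail the build when a metric blows the budget. The audit key and field below vary between Lighthouse versions, so treat them as illustrative:

    // Sketch: fail CI if first contentful paint in a Lighthouse JSON report exceeds a budget.
    // Assumes a report produced with: lighthouse https://example.com --output=json --output-path=report.json
    import { readFileSync } from 'fs';

    const report = JSON.parse(readFileSync('report.json', 'utf8'));
    const fcpMs = report.audits['first-contentful-paint'].rawValue; // field name depends on Lighthouse version
    const BUDGET_MS = 2000;

    if (fcpMs > BUDGET_MS) {
      console.error(`First contentful paint ${fcpMs}ms exceeds the ${BUDGET_MS}ms budget`);
      process.exit(1);
    }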
  73. @nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics webpagetest.org ✅ debug ✅ repeatable ❌ actual

    users * webpagetest.org is an amazing tool for testing your website’s performance
  74. @nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics webpagetest.org ✅ debug ✅ repeatable ❌ actual

    users * Super detailed performance analysis, including details on hard to measure metrics like speed index * Can do analysis on the video, measuring speed index and ATF rendering easily * Simulates real users connections with rate limiting * You can choose locations, making it more real * Has an API to make repeatable things
  75. @nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics Sentry / Google Analytics / New relic

    ❌ debug ✅ repeatable ✅ actual users * We use New Relic * Can send custom page actions * Really useful for measuring data on your actual users
  76. @nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics Why measure What to measure ✅ “Good”

    metrics ⚖ Metric choices ⚒ Ways to measure Protect performance Finally, protecting performance
  77. @nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics Now what? ✅ Meaningful metric ✅ Tool

    to measure it Great, you’ve chosen a meaningful metric and you have the tools to measure it. Now what?
  78. @nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics We need to actually fix the stuff!

    Great, you’ve chosen a meaningful metric and you have the tools to measure it. Now what? * You need to fix performance issues * This is the easy bit! Plenty of info (including later today) about how to do that
  79. @nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics How much time will it take? How

    much time will it take? … Not the right question to be asking. This is not a one-off effort
  80. @nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics In it for the long run You

    need to build a culture of performance Metrics can help you do this
  81. @nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics Can we add a video? Design: ✅

    Product: ✅ Eng: “It’ll be slower” * Designers think it’s great * Product people think it’d be cool * Engineer: Wait, this will be slower…
  82. @nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics 1. Choose the right metrics 2. Get

    everyone to agree on a limit 3. Bring it up during planning 4. Figure out how to stay in budget Performance budgets * Getting people to agree might be easier than you think. People will agree to a meaningful metric. E.g. “Our time to interactive budget will be <= 1 second” or “<= 1 MB” (assuming page weight is linked to a better UX) – a sketch of a simple budget check follows below. https://speedcurve.com/blog/performance-budgets-in-action/
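    As a concrete (hypothetical) version of step 4 for a page-weight budget, a tiny check that fails the build when the main bundle grows past the agreed limit (the file path and limit are made up):

    // Sketch: fail the build if the main bundle exceeds the agreed page-weight budget.
    import { statSync } from 'fs';

    const BUDGET_BYTES = 1 * 1024 * 1024; // the "<= 1 MB" budget the team agreed on
    const size = statSync('dist/main.js').size; // hypothetical bundle path

    if (size > BUDGET_BYTES) {
      console.error(`dist/main.js is ${size} bytes, over the ${BUDGET_BYTES} byte budget`);
      process.exit(1);
    }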
  83. @nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics Choosing limits 0.1 seconds - instant 1.0

    seconds - flow of thought 10 seconds - task switch What is the current state? Improve it What are your competitors doing?
  84. @nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics performancebudget.io * Estimate what a target bundle

    size would be for a given time * Normal considerations for bundle sizes though: not automatically meaningful
  85. @nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics 0.1s 1s 10s Instant “Flow” Task switch

    [Diagram: response-time samples on the 0.1s / 1s / 10s timeline, split at a 4s threshold into Satisfied, Tolerating and Dissatisfied] * The formula for Apdex
  86. @nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics 0.1s 1s 10s Instant “Flow” Task switch

    [Same diagram, with the samples counted] Apdex = (6 + 2.5) / 15 = 0.57 * The formula for Apdex
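    The Apdex formula behind that number: satisfied samples count fully, tolerating samples count half, frustrated samples count nothing, all divided by the total sample count. Reading the slide’s 2.5 as 5 tolerating samples at half weight, the 0.57 comes from 6 satisfied and 5 tolerating samples among 15:

    // Sketch: Apdex = (satisfied + tolerating / 2) / total samples.
    const apdex = (satisfied: number, tolerating: number, total: number): number =>
      (satisfied + tolerating / 2) / total;

    console.log(apdex(6, 5, 15).toFixed(2)); // "0.57", matching the slide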
  87. @nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics 0.1s 1s 10s Instant “Flow” Task switch

    [Same diagram, but measuring Load = DOMContentLoaded] * With the wrong metric, you could be reporting incorrectly what real user experiences are like. * Apdex = 1 while users silently suffer
  88. @nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics 0.1s 1s 10s Instant “Flow” Task switch

    [Same diagram with T = 2s: satisfied up to 2s, tolerating up to 4T = 8s] * Remember, tolerating = 4 * T * Users with load times of up to 8 seconds are going to be happy
  89. @nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics Company apdex = Jira * 0.2 +

    Confluence * 0.5 + Trello * 0.3 Note: not real numbers This is useful for us Note: each product defines their own Apdex Share: T values, expectations, etc. Customise: what a “meaningful” state or interaction means for your users
  90. @nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics Pick the right metric * Check what

    your metric is. Sanity check it. Is that really capturing the true user experience? *
  91. @nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics Grow a performance culture * You need

    to institute a performance culture. * Performance needs to be something thought about during planning and design – not just engineering
  92. @nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics Let’s make the internet better * The

    internet is a vehicle for free information for all * For me, dollars aren’t inspiring. But the free internet, the open exchange of information, is literally impossible for some people unless the internet is fast.
  93. @nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics Josh Nelson Atlassian @nelsonjoshpaul he/him/his Thanks! Questions?

    Illustrations designed by Freepik from flaticon.com tinyurl.com/meaningful-performance-metrics