Slide 1

Slide 1 text

@nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics Josh Nelson Atlassian @nelsonjoshpaul he/him/his Meaningful performance metrics Measure the right stuff Illustrations designed by Freepik from flaticon.com

Slide 2

Slide 2 text


Slide 3

Slide 3 text

15.6s

Slide 4

Slide 4 text

Median page load time on mobile

Slide 5

Slide 5 text

Let’s fix it

Slide 6

Slide 6 text

Developer @ Atlassian

Slide 7

Slide 7 text

* We make tools for teams * Hands up if you’ve ever used one of these tools * Honestly, even more stuff is out there than is listed here

Slide 8

Slide 8 text

* This is what Jira Cloud looks like * If you haven’t used it for a while, check it out; we’ve made it even better

Slide 9

Slide 9 text

Performance We’re going to talk about performance. It’s something we’re focusing on at Atlassian. We’re looking to improve real user experiences

Slide 10

Slide 10 text

Who am I? But who am I to tell you about performance?

Slide 11

Slide 11 text

* Lived in Sydney, Australia

Slide 12

Slide 12 text

Flat white

Slide 13

Slide 13 text


Slide 14

Slide 14 text

World’s tallest rubbish bin

Slide 15

Slide 15 text

* I know the pain of slow internet * They ran this test when I lived there, in 2009 * The Prime Minister said: “If [the opposition] had their way Australians would be left using carrier pigeons for the future rather than accessing an internationally competitive broadband network” * So, they tested it. They flew a carrier pigeon with a 700-megabyte USB drive from the central west to Sydney, about 100 km or 60 miles. https://www.itnews.com.au/news/australian-internet-fails-pigeon-test-159232

Slide 16

Slide 16 text

* The pigeon won, in 1:05 * They sent a car too, which took 2:10 * The internet dropped out twice and didn’t even make it

Slide 17

Slide 17 text

* Internet speeds suck out there. “Never underestimate the bandwidth of a station wagon full of tapes hurtling down the highway.” –Andrew Tanenbaum, 1981

Slide 18

Slide 18 text

ADSL speed in Australia: ~7,990 kbps * This is kind of shocking for anyone who visits Australia from America

Slide 19

Slide 19 text

Good news! Internet gets faster over time * But don’t worry! As technology changes, things get better over time automatically!

Slide 20

Slide 20 text

* Internet speed in the US over time, in megabits per second. Going up! https://www.statista.com/statistics/616210/average-internet-connection-speed-in-the-us/

Slide 21

Slide 21 text

* And great! Thanks to Moore’s law, CPU speeds have been going up for years! * Sweet, problem solved, right?

Slide 22

Slide 22 text

But the web isn’t getting faster!

Slide 23

Slide 23 text

* Massive survey on the state of JavaScript * Median “onload” time: 5.5 seconds on desktop, or 15.6 seconds on mobile * https://httparchive.org/reports/state-of-javascript?start=earliest&end=latest&view=list

Slide 24

Slide 24 text

Why? * We’re getting

Slide 25

Slide 25 text

* There’s a similar thing with highways. * When they add highway lanes, traffic doesn’t improve

Slide 26

Slide 26 text

1% increase in capacity → Up to 1.1% increase in demand * The “induced demand” effect: for every 1 percent increase in highway capacity, traffic increases 0.29 to 1.1 percent in the long term (about five years out), and up to 0.68 percent in the short term (one or two years) https://trrjournalonline.trb.org/doi/abs/10.3141/2653-02?journalCode=trr

Slide 27

Slide 27 text

* The same survey shows us that page weight is going up too * https://httparchive.org/reports/state-of-javascript?start=earliest&end=latest&view=list

Slide 28

Slide 28 text

It’s not getting better

Slide 29

Slide 29 text

But for the sake of the user, it needs to. * Performance is a problem

Slide 30

Slide 30 text

Performance impacts 1. Business goals 2. User happiness 3. The world * Performance is a problem

Slide 31

Slide 31 text

Business goals A one-second delay in Bing results in a 2.8% drop in revenue. A two-second delay results in a 4.3% drop. * https://wpostats.com/tags/revenue/

Slide 32

Slide 32 text

Business goals Walmart: a 2% increase in conversions for every 1 second of improvement in load time. Every 100 ms improvement also resulted in up to a 1% increase in revenue. * It’s a UX problem * https://wpostats.com/tags/revenue/

Slide 33

Slide 33 text

Business goals Google's DoubleClick found that publishers whose mobile sites load in 5 seconds earn up to 2x more mobile ad revenue than sites loading in 19 seconds. * It’s a UX problem * https://wpostats.com/tags/revenue/

Slide 34

Slide 34 text

Business goals * Show this picture to your manager and they’ll be immediately convinced

Slide 35

Slide 35 text

The world “YouTube Feather” * Made a version of YouTube 90% lighter * Opted in a small % of traffic. * But average load time went up??

Slide 36

Slide 36 text

The world Southeast Asia, South America, Africa and Siberia. * Actually more traffic from countries with poor connectivity! * Pages in these countries would normally have taken two minutes to load! * Australia not far off * It actually affects whether people can use it at all * https://wpostats.com/2015/11/11/youtube-feather.html

Slide 37

Slide 37 text

User happiness 0.1s 1s 10s Instant “Flow” Task switch * Research has shown that there are three important response-time limits * <0.1s Instant – the system is reacting immediately * <1s Flow – I’m in a state of flow and my concentration is not broken between tasks * <10s Task switch – I’m out of here * https://www.nngroup.com/articles/response-times-3-important-limits/
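Purely as an illustration, the three limits above can be sketched as a tiny classifier (the category names are mine, not from the research):

```javascript
// Nielsen's three response-time limits as a toy classifier.
// Thresholds come from the slide; the labels are illustrative only.
function classifyResponseTime(seconds) {
  if (seconds <= 0.1) return "instant";     // feels like direct manipulation
  if (seconds <= 1) return "flow";          // noticeable, but flow is unbroken
  if (seconds <= 10) return "frustration";  // attention starts to wander
  return "task switch";                     // the user has left
}
```

By this yardstick, the 5.5s desktop median sits squarely in the frustration zone, and the 15.6s mobile median is past the task-switch limit.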

Slide 38

Slide 38 text

User happiness 0.1s 1s 10s Instant “Flow” Task switch ☹ * Between 1s and 10s is a whole lot of emotion too * Users are progressively getting frustrated and want to switch to another task in this period of time

Slide 39

Slide 39 text

User happiness 0.1s 1s 10s Instant “Flow” Task switch ☹ Median desktop 5.5s 15.6s Median mobile * Between 1s and 10s is a whole lot of emotion too * Users are progressively getting frustrated and want to switch to another task in this period of time

Slide 40

Slide 40 text

Poor performance is an ignored problem * Not only is it a problem, it’s an ignored problem

Slide 41

Slide 41 text

Preaching to the choir * I’m sure I’m preaching to the choir here, if you’ve come to this talk. * Do you agree with these statistics that I’ve mentioned so far? * Have you already heard it before? * Do you really care?

Slide 42

Slide 42 text

Then, why? Then, why can’t we instigate change across our organizations?

Slide 43

Slide 43 text

“It’s such a big problem, we can’t tackle it” The “give up” Then, why?

Slide 44

Slide 44 text

“It’s ok, our users have fast internet” The denial Then, why?

Slide 45

Slide 45 text

1: Oh god “performance” is terrible 2: We fixed performance! 3: Wait 4: GOTO 1 The loop Then, why?

Slide 46

Slide 46 text

This continues until the inevitable heat death of the universe. Ever-expanding entropy claims all. Chaos reigns supreme. We can’t control it. Performance will always regress

Slide 47

Slide 47 text

Or, we try Or, we try and beat it.

Slide 48

Slide 48 text

Meaningful metrics can save you Let me explain how

Slide 49

Slide 49 text

Why measure What to measure ✅ “Good” metrics ⚖ Metric choices ⚒ Ways to measure Protect performance Here’s an overview of what I’m going to talk about today

Slide 50

Slide 50 text

Why measure What to measure ✅ “Good” metrics ⚖ Metric choices ⚒ Ways to measure Protect performance First, let’s talk about why we measure

Slide 51

Slide 51 text

“I already know what to fix, let’s just fix it!” This is really tempting. You probably already have an idea of what changes you need to make. “Yeah, we need to get our bundle size down, and we need to speed up this particular endpoint.” However! Don’t let the takeaway from this talk be “I should go back and improve my app’s performance in this specific way”

Slide 52

Slide 52 text

You can’t improve what you can’t measure This is a popular saying. Meaning: changing (improving) something requires knowing what you want to change, and how you can tell if it worked. It’s useful because you can know if you’re doing the right things for your goals. However, I think it’s missing something

Slide 53

Slide 53 text

You can’t meaningfully improve what you can’t meaningfully measure I’m going to drop the word “meaningful” in there. It’s important that you’re measuring the right things. Measuring something is easy, but measuring the right thing is hard. You can’t make meaningful impacts without careful thought about what you’re measuring! You can’t change the right things if you’re not measuring the right things

Slide 54

Slide 54 text

You can’t meaningfully improve what you can’t meaningfully measure Just want to really emphasise that

Slide 55

Slide 55 text

You can’t meaningfully improve what you can’t meaningfully measure Like, really This slide deck contains the word meaningful 46 times

Slide 56

Slide 56 text

What is meaningful? As Plato would say,

Slide 57

Slide 57 text

Better for users It’s what’s impactful for your users. Ultimately, this is what it’s about. Have a user-centric metric. As we said before, performance is a UX problem, so we need to measure what users are really experiencing and put some thought into that

Slide 58

Slide 58 text

Why measure What to measure ✅ “Good” metrics ⚖ Metric choices ⚒ Ways to measure Protect performance Let’s take a look at what we can measure in order to improve the UX

Slide 59

Slide 59 text

“Performance” We often talk about “performance”. But how does that apply to our users?

Slide 60

Slide 60 text

The time for a page to load for a user “Performance”

Slide 61

Slide 61 text

The time for a page to load for a user “Performance”

Slide 62

Slide 62 text

This is a page, you might be familiar with it

Slide 63

Slide 63 text

Meaning Behaviour Appearance Media Information * If we’re going to think of it from the user’s perspective, it’s this

Slide 64

Slide 64 text

HTML JavaScript CSS Assets Data { } * Which is, from our perspective, HTML, JavaScript, CSS, assets, and data * So, how do these arrive in the user’s browser?

Slide 65

Slide 65 text

DNS lookup GET request Parse HTML Fetch js Fetch CSS Parse js Parse CSS Layout / paint * How does the computer see what matters to our users? * I’m going to go over the metrics themselves later; the slide deck is available for reference * The way this is put together is relevant for us so we can know which parts are meaningful for the user

Slide 66

Slide 66 text

Fetch js Fetch deferred js Parse deferred js Fetch CSS Parse js Parse CSS Fetch more data Layout / paint

Slide 67

Slide 67 text

Fetch deferred js Parse deferred js Fetch more data Event handler

Slide 68

Slide 68 text

Done? * This is what you’d traditionally consider a page to be * But, remember, we’re thinking about this in a meaningful context. What is meaningful to our users? * There’s more to what they might consider a “page” * With SPAs and PWAs, we need to be even more nuanced about what’s going on here

Slide 69

Slide 69 text

Fetch deferred js Parse deferred js Fetch more data Event handler * If we go forward in time a bit, we see there’s actually more

Slide 70

Slide 70 text

Fetch more data Event handler Push state Load data Rerender * You might reload data on button click * This is quite a different experience from the initial load, but a load nonetheless!

Slide 71

Slide 71 text

• An HTML document • A single page app state • An app state • Whatever your users think it is So, what’s a “page”? * We need to keep in mind what our users would think of as a page, and have measurements around that experience

Slide 72

Slide 72 text

The time for a page to load for a user “Performance”

Slide 73

Slide 73 text

This is a slowed down page load. Put your hand up, and lower it when you think the page is loaded

Slide 74

Slide 74 text

Load is not a single moment in time — it’s an experience that no one metric can fully capture. There are multiple moments during the load experience that can affect whether a user perceives it as "fast" or “slow" – https://w3c.github.io/paint-timing/

Slide 75

Slide 75 text

Loading is not a boolean state. There is no one event, or one easy answer, to say when a page has loaded. It might even require knowledge of the future! Loading is a spectrum. Metrics compress this spectrum into a single number. We need to be careful about how we choose this number

Slide 76

Slide 76 text

The time for a page to load for a user “Performance”

Slide 77

Slide 77 text

Who your users are might seem obvious, but it can be hard to determine

Slide 78

Slide 78 text

Speed * Fast CPU, or slow CPU

Slide 79

Slide 79 text

Location * Great internet in Richmond VA * Terrible internet in Parkes, Australia

Slide 80

Slide 80 text

Regularity * Return visitor to your website * First time visitor to your website

Slide 81

Slide 81 text

Measure these things! * We need to measure these things! * We need to pick metrics that take these user attributes into account

Slide 82

Slide 82 text

Why measure What to measure ✅ “Good” metrics ⚖ Metric choices ⚒ Ways to measure Protect performance Let’s take a look at what makes a good metric

Slide 83

Slide 83 text

⚠ Bad metrics are out there! * Bad metrics are out there * The default way you think about it might be bad! * Your tools might be measuring it in a bad way

Slide 84

Slide 84 text

* Project I am working on * Slowed down video * By default, we measured “load” time

Slide 85

Slide 85 text

* Load event is fired here * Is this loaded?

Slide 86

Slide 86 text

⚠ Bad metrics are dangerous * You’ll focus on the wrong things, neglecting real issues * You’ll change random numbers, but users will remain dissatisfied!

Slide 87

Slide 87 text

Good metrics * Let’s look at some good metrics

Slide 88

Slide 88 text

Reflects real experiences * Meaningful = meaningful for users * We need to figure out what our users are really experiencing with our metrics * There are a number of ways that we can ensure we’re measuring this for real users

Slide 89

Slide 89 text

✅ Real devices ✅ Real networks ✅ Sanity check Reflects real experiences * If it is run on real devices, that’s a good sign * We need to know what real users’ networks are like * Do a sanity check. When is your metric being triggered? Is that state actually what you think it is?

Slide 90

Slide 90 text

Is a “continuous” function * Another quality of a good metric is one that is a continuous function

Slide 91

Slide 91 text

Is a “continuous” function A small improvement in the metric relates to a small improvement in the UX. * An example of something that isn’t this: total bundle size, if we’re code splitting. Reducing it may do nothing

Slide 92

Slide 92 text

Is a “continuous” function Side effect: no cheating! * Think of the people looking at this metric as a greedy optimisation algorithm * If they can cheat, they will * The shortest path to improving the metric should be the one that improves the user experience * You shouldn’t be rewarded for showing a loading spinner very quickly, if that doesn’t result in a correspondingly good UX

Slide 93

Slide 93 text

♻ Is repeatable

Slide 94

Slide 94 text

Never send a human To do a machine’s job * As Agent Smith would say, “Never send a human to do a machine’s job” * Some metrics you can easily get in a repeatable way through monitoring * Others (like auditing tools) are tempting to run as one-offs * Spend the time building automatic tooling to report. This is critical * If you rely on humans, this isn’t going to work * Historical data is your friend in arguments

Slide 95

Slide 95 text

Why measure What to measure ✅ “Good” metrics ⚖ Metric choices ⚒ Ways to measure Protect performance With the knowledge of what makes a good or bad metric, let’s look at some, and think about them for our use cases

Slide 96

Slide 96 text

⚡ Metric sound off! * Let’s finally have a look at some metrics

Slide 97

Slide 97 text

⚡ Metric sound off! joshnelson.io/pages/metrics * Let’s finally have a look at some metrics

Slide 98

Slide 98 text

Good for all apps With the knowledge of what makes a good or bad metric, let’s look at some, and think about them for our use cases

Slide 99

Slide 99 text

Page weight ✅ Easy to measure ✅ js is expensive ⚠ Proxy for real experience * See “The Cost of JavaScript” blog post by Addy Osmani

Slide 100

Slide 100 text

Paint timing! First paint First contentful paint First meaningful paint * You can guess my favourite (the one with meaningful in the name) * First paint is when the browser first renders anything other than white * First contentful paint is when the browser renders any elements (e.g. a spinner) * First meaningful paint is when the browser renders something that is meaningful for the user (e.g. data) * The first two are easy to measure, but the last one is the real pot of gold. It’s harder to measure in a consistent way.
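In browsers that support the Paint Timing API, the first two paints can be collected with a `PerformanceObserver`. A minimal sketch (the `report` callback is a placeholder for whatever analytics sink you use):

```javascript
// Sketch: record First Paint and First Contentful Paint via the Paint
// Timing API. Returns the observer, or null where paint timing is
// unavailable (older browsers, non-browser environments).
function observePaintTimings(report) {
  if (typeof PerformanceObserver === "undefined") return null;
  try {
    const observer = new PerformanceObserver((list) => {
      for (const entry of list.getEntries()) {
        // entry.name is "first-paint" or "first-contentful-paint"
        report(entry.name, entry.startTime);
      }
    });
    // `buffered` asks for paints that happened before we started observing
    observer.observe({ type: "paint", buffered: true });
    return observer;
  } catch (e) {
    return null; // the "paint" entry type is not supported here
  }
}
```

First *meaningful* paint has no standard entry type; it needs app-specific instrumentation (see the User timing API later).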

Slide 101

Slide 101 text

Speed index Integrate the % of the page loaded over time * Speed index looks at what % of the page is loaded over time, retroactively * We then score it based on how much was delivered, and how soon * This maps really well to perceived load time * Optimizing for this number will help load time

Slide 102

Slide 102 text

Speed index ✅ Very reflective of real UX ✅ Reflects progressive loading ⚠ Hard to measure on real devices ⚠ Hard to understand (unitless) Integrate the % of the page loaded over time * It’s one of the best metrics for reflecting progressive loading
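The "integrate % loaded over time" idea can be approximated numerically. The sample shape below is made up for illustration; in practice tools like WebPageTest derive visual completeness from video frames:

```javascript
// Toy speed-index calculation: integrate (1 - visual completeness)
// over the load timeline. `samples` is a time-sorted list of
// { time (ms), completeness (0..1) } pairs. Lower = faster.
function speedIndex(samples) {
  let index = 0;
  for (let i = 1; i < samples.length; i++) {
    const interval = samples[i].time - samples[i - 1].time;
    // Area where the page was still incomplete during this interval
    index += interval * (1 - samples[i - 1].completeness);
  }
  return index;
}
```

This is why speed index rewards progressive loading: a page that shows half its content at 500ms scores better than one that stays blank until 1000ms, even if both finish at the same moment.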

Slide 103

Slide 103 text

ATF render time ✅ Reflective of UX for initial load ✅ Easy to understand ⚠ Doesn’t deal with post initial load ⚠ Hard to measure in real browsers ⚠ Your users probably scroll Time until all content above the fold is rendered * http://abovethefold.fyi/

Slide 104

Slide 104 text

Load event Triggered after DOMContentLoaded, and after js downloads ✅ Easy to measure ✅ Easy to understand ⚠ No async data requests ⚠ May not be meaningful * This one is super common: available everywhere, easy to implement * Risky though

Slide 105

Slide 105 text

Good for SSR apps

Slide 106

Slide 106 text

First byte When did the first byte arrive in the browser? ✅ Easy to measure ✅ Measures backend problems ⚠ May not be meaningful * The performance timing API can help you measure this * It mightn’t mean anything if the first byte isn’t meaningful itself
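As a sketch of the "performance timing API" route, the first byte's arrival is exposed as `responseStart` on the navigation timing entry:

```javascript
// Sketch: time to first byte from the Navigation Timing API.
// Returns milliseconds since navigation start, or null where
// navigation timing entries are unavailable.
function timeToFirstByte() {
  if (typeof performance === "undefined" || !performance.getEntriesByType) {
    return null;
  }
  const [nav] = performance.getEntriesByType("navigation");
  // responseStart marks the arrival of the first byte of the response
  return nav ? nav.responseStart : null;
}
```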

Slide 107

Slide 107 text

Time to interactive The time it takes until buttons work ✅ Measures user interactions ✅ Highly interactive apps ⚠ Needs a polyfill ⚠ Less meaningful after page load * Works by detecting CPU idle time, and picks a point where the buttons on your page will probably work

Slide 108

Slide 108 text

First input delay On the first click, how long did that take? ✅ Reflects actual user pain ⚠ Depends on user input (focus on the 90th percentile) * First input delay reflects actual user problems * It will naturally depend on when the user interacts
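A rough sketch of the idea (the real first-input-delay polyfill is considerably more careful about event types and double counting): on the first discrete input, compare when the handler actually ran with when the event was created. `report` is a placeholder for your analytics call.

```javascript
// Rough first-input-delay approximation. Browser-only; does nothing
// in environments without DOM event listeners.
function trackFirstInputDelay(report) {
  if (typeof addEventListener === "undefined") return; // not a browser
  const types = ["click", "keydown", "pointerdown"];
  const onFirstInput = (event) => {
    // If the main thread was busy, the handler runs well after the
    // event's creation time; that gap is (approximately) the delay.
    const delay = performance.now() - event.timeStamp;
    report(Math.max(delay, 0));
    types.forEach((type) => removeEventListener(type, onFirstInput, true));
  };
  types.forEach((type) => addEventListener(type, onFirstInput, true));
}
```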

Slide 109

Slide 109 text

Avoid With the knowledge of what makes a good or bad metric, let’s look at some, and think about them for our use cases

Slide 110

Slide 110 text

DOMContentLoaded ⚠ Basically never what you want ⚠ Try to use the load event instead Parse HTML and synchronous js
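To see why the two differ, you can log both milestones side by side (a sketch; `report` is a placeholder for your logging):

```javascript
// Sketch: log both load milestones to see how far apart they land.
// DOMContentLoaded fires after HTML parsing and synchronous scripts;
// the "load" event also waits for stylesheets, images and other
// subresources, so it lands much closer to what users perceive.
function logLoadMilestones(report) {
  if (typeof document === "undefined") return; // browser only
  document.addEventListener("DOMContentLoaded", () =>
    report("DOMContentLoaded", performance.now())
  );
  window.addEventListener("load", () => report("load", performance.now()));
}
```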

Slide 111

Slide 111 text

DNS lookup GET request Parse HTML Fetch js ⏳DOMContentLoaded ⏳First byte Fetch CSS ⏳Navigation timing API Parse js Parse CSS Layout / paint ⏳First contentful paint * Remember this timeline from before? Now we can put some events on it to measure the various bits!

Slide 112

Slide 112 text

Fetch js Fetch deferred js ⏳DOMContentLoaded Parse deferred js Fetch CSS Parse js Parse CSS Fetch more data Layout / paint ⏳First paint ⏳First meaningful paint ⏳First CPU idle (interactive) ⏳First contentful paint

Slide 113

Slide 113 text

Fetch deferred js Parse deferred js Fetch more data ⏳First paint ⏳First meaningful paint ⏳First CPU idle (interactive) ⏳First contentful paint ⏳First input delay Event handler

Slide 114

Slide 114 text

Fetch more data ⏳First input delay Event handler Push state Load data Rerender ⏳Page reloaded * You might reload data on button click * This is quite a different experience from the initial load, but a load nonetheless!

Slide 115

Slide 115 text

joshnelson.io/pages/metrics * Let’s finally have a look at some metrics

Slide 116

Slide 116 text

Metric starter pack Time to interactive Bundle size First input delay ✨ First meaningful paint * If you don’t have time, the first three strike a good balance between ease of adoption and general meaningfulness. * But if you do, you should investigate first meaningful paint.

Slide 117

Slide 117 text

Why measure What to measure ✅ “Good” metrics ⚖ Metric choices ⚒ Ways to measure Protect performance Let’s take a look at what we can measure in order to improve the UX

Slide 118

Slide 118 text

⚒ Tools * A metric is only as good as your ability to measure it * We may need to make compromises in order to be able to measure things

Slide 119

Slide 119 text

Devtools ✅ debug ❌ repeatable ❌ actual users * One of the best tools is right under your nose * Measures just about all of the previous metrics in a debugging context * You can simulate real devices, but ultimately it isn’t real * Hard to make a business case from the devtools

Slide 120

Slide 120 text

User timing API * The User timing API is available so you can get custom insights * Supported in IE11+ and all evergreen browsers
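The core of the API is `performance.mark` and `performance.measure`. A minimal sketch of timing a custom step (the step name and `work` callback are illustrative):

```javascript
// Minimal User Timing sketch: mark the start and end of a custom step
// and measure the span between them. The resulting "measure" entries
// show up in devtools timelines and can be sent to analytics.
function measureStep(name, work) {
  performance.mark(`${name}:start`);
  work();
  performance.mark(`${name}:end`);
  performance.measure(name, `${name}:start`, `${name}:end`);
  const [entry] = performance.getEntriesByName(name, "measure");
  return entry.duration; // milliseconds
}
```

This is the usual building block for a custom "first meaningful paint": mark navigation start, mark the moment your meaningful content renders, and measure between them.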

Slide 121

Slide 121 text

github.com/jpnelson/react-component-timing * We have a tool that we use to measure component load speeds and first meaningful paint, built into React * It also marks things using the User timing API, and integrates with tracking tools * This is still being refined, but open to PRs and comments, please!

Slide 122

Slide 122 text

Lighthouse ✅ debug ❌ repeatable ❌ actual users * AKA the “Audits” tab in Google Chrome * A newish integration into Google Chrome * Does a good job of simulating real devices (e.g. network speeds and device speeds)

Slide 123

Slide 123 text

Lighthouse • First contentful paint • Speed index • Time to interactive • First meaningful paint* • Many more

Slide 124

Slide 124 text

Lighthouse * Gives you actionable things to do, making it a very good debugging tool

Slide 125

Slide 125 text

@nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics Lighthouse CI ✅ debug ✅ repeatable ❌ actual users * Remember, never send a human to do a machine’s job * We need a repeatable measurement to make an impact * Lighthouse CI can tell you at PR time, before any regressions ship
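A sketch of what a Lighthouse CI budget check can look like, following the `lighthouserc.js` config shape from Lighthouse CI's documentation. The URL, run count, and thresholds are placeholders, not values from the talk; check the current Lighthouse CI docs before copying.

```javascript
// lighthouserc.js — sketch of a Lighthouse CI assertion config.
// Values here are illustrative placeholders.
module.exports = {
  ci: {
    collect: {
      url: ["http://localhost:8080/"], // page(s) to audit on each PR
      numberOfRuns: 3,                 // multiple runs reduce noise
    },
    assert: {
      assertions: {
        // Fail the build if first contentful paint regresses past 2s
        "first-contentful-paint": ["error", { maxNumericValue: 2000 }],
        // Or if time to interactive regresses past 5s
        "interactive": ["error", { maxNumericValue: 5000 }],
      },
    },
  },
};
```

Wiring this into CI is what makes the metric repeatable: a regression becomes a failed build rather than a number someone has to notice.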

Slide 126

Slide 126 text

@nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics webpagetest.org ✅ debug ✅ repeatable ❌ actual users * webpagetest.org is an amazing tool for testing your website’s performance

Slide 127

Slide 127 text

@nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics webpagetest.org ✅ debug ✅ repeatable ❌ actual users * Super detailed performance analysis, including hard-to-measure metrics like speed index * Can analyse the recorded video, measuring speed index and above-the-fold rendering easily * Simulates real user connections with rate limiting * You can choose test locations, making it more realistic * Has an API so measurements can be automated and repeated
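A sketch of kicking off a test via the WebPageTest API's `runtest.php` endpoint. The API key and the location string are placeholders (real location identifiers come from WebPageTest's location list), so treat this as the shape of the request rather than a copy-paste recipe.

```javascript
// Build a WebPageTest API request URL for automated, repeatable tests.
// apiKey and the location value are placeholders.
function buildWptUrl(apiKey, targetUrl) {
  const params = new URLSearchParams({
    url: targetUrl,
    k: apiKey,          // your WebPageTest API key
    f: "json",          // ask for a machine-readable JSON response
    location: "Sydney:Chrome", // placeholder — pick a location near your users
  });
  return `https://www.webpagetest.org/runtest.php?${params}`;
}
```

Scheduling a call like this from CI or a cron job turns WebPageTest's one-off analysis into a repeatable measurement you can trend over time.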

Slide 128

Slide 128 text

@nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics Sentry / Google Analytics / New Relic ❌ debug ✅ repeatable ✅ actual users * We use New Relic * Can send custom page actions * Really useful for measuring data on your actual users

Slide 129

Slide 129 text

@nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics Why measure What to measure ✅ “Good” metrics ⚖ Metric choices ⚒ Ways to measure Protect performance Finally, protecting performance

Slide 130

Slide 130 text

@nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics Now what? ✅ Meaningful metric ✅ Tool to measure it Great, you’ve chosen a meaningful metric and you have the tools to measure it. Now what?

Slide 131

Slide 131 text

@nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics We need to actually fix the stuff! * You need to fix performance issues * This is the easy bit! Plenty of info (including later today) about how to do that

Slide 132

Slide 132 text

@nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics How much time will it take? How much time will it take? … Not the right question to be asking. This is not a one-off effort

Slide 133

Slide 133 text

@nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics In it for the long run You need to build a culture of performance Metrics can help you do this

Slide 134

Slide 134 text

@nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics Let’s make this new page! And maybe we put a video in the background instead?

Slide 135

Slide 135 text

@nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics Can we add a video? Design: ✅ Product: ✅ Eng: “It’ll be slower” Designers think it’s great Product people think it’d be cool Engineer: Wait, this will be slower…

Slide 136

Slide 136 text

@nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics Performance budgets Performance budgets are a structured way to have this conversation

Slide 137

Slide 137 text

@nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics 1. Choose the right metrics 2. Get everyone to agree on a limit 3. Bring it up during planning 4. Figure out how to stay in budget Performance budgets * Getting people to agree might be easier than you think. People will agree to a meaningful metric. Eg, “Our time to interactive budget will be <=1 second” or “<= 1mb” (assuming the page weight is linked to a better UX) — https://speedcurve.com/blog/performance-budgets-in-action/
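The four steps above can be sketched as a tiny budget check you might run in CI. The metric names and limits here are made up for illustration; the talk's only concrete example is a time-to-interactive budget of one second.

```javascript
// A performance budget: metrics everyone agreed on, with hard limits.
// These names and numbers are illustrative placeholders.
const budgets = {
  "time-to-interactive-ms": 1000, // "our TTI budget will be <= 1 second"
  "bundle-size-kb": 1024,         // only meaningful if weight maps to UX
};

// Return the metrics that exceed their agreed limit, so a build
// script can fail (or a planning conversation can start).
function overBudget(measurements, budgets) {
  return Object.entries(budgets)
    .filter(([metric, limit]) => measurements[metric] > limit)
    .map(([metric]) => metric);
}
```

The value of the check isn't the code: it's that "can we add a video?" becomes "which budget does the video spend, and what do we cut to stay inside it?"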

Slide 138

Slide 138 text

@nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics Choosing limits I can’t just tell you :(

Slide 139

Slide 139 text

@nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics Choosing limits 0.1 seconds - instant 1.0 seconds - flow of thought 10 seconds - task switch What is the current state? Improve it What are your competitors doing?

Slide 140

Slide 140 text

@nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics performancebudget.io * Estimate what a target bundle size would be for a given time * Normal considerations for bundle sizes though: not automatically meaningful

Slide 141

Slide 141 text

@nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics Apdex * A huge part of performance metrics I haven’t mentioned yet: What about Apdex?

Slide 142

Slide 142 text

@nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics Load time ≤ “T value” means satisfied Load time ≤ T×4 means tolerating Anything slower means dissatisfied * The formula for Apdex

Slide 143

Slide 143 text

@nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics [Diagram: load-time samples plotted against the 0.1s (instant), 1s (“flow”), and 10s (task switch) thresholds, with 4s marked] * The formula for Apdex

Slide 144

Slide 144 text

@nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics [Diagram: the same load-time samples, with the Satisfied, Tolerating, and Dissatisfied regions labelled] * The formula for Apdex

Slide 145

Slide 145 text

@nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics [Diagram: the same load-time samples with regions labelled] Apdex = (6 + 2.5) / 15 = 0.57 * The formula for Apdex
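The Apdex formula from the slides, as a small sketch: satisfied samples (load ≤ T) count in full, tolerating samples (load ≤ 4T) count half, and dissatisfied samples count zero. The sample values below are invented to reproduce the slide's worked example of (6 + 2.5) / 15 ≈ 0.57.

```javascript
// Apdex from raw load-time samples and a chosen T value (seconds).
function apdex(loadTimes, t) {
  const satisfied = loadTimes.filter((x) => x <= t).length;
  const tolerating = loadTimes.filter((x) => x > t && x <= 4 * t).length;
  // Dissatisfied samples (> 4T) contribute nothing to the numerator.
  return (satisfied + tolerating / 2) / loadTimes.length;
}

// Illustrative samples: 6 satisfied, 5 tolerating, 4 dissatisfied at T = 1s.
const samples = [0.2, 0.4, 0.5, 0.6, 0.8, 1.0, 1.5, 2, 2.5, 3, 4, 5, 6, 8, 12];
apdex(samples, 1); // (6 + 2.5) / 15 ≈ 0.57
```

Note how both inputs are choices: the load-time definition decides what gets measured, and T decides where "satisfied" ends, which is exactly why the next slides warn about picking them meaningfully.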

Slide 146

Slide 146 text

@nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics ⚠ Pick your T value meaningfully ⚠ Pick your load metric meaningfully

Slide 147

Slide 147 text

@nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics ⚠ Pick your T value meaningfully ⚠ Pick your load metric meaningfully

Slide 148

Slide 148 text

@nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics [Diagram: the same samples, with Load = DOMContentLoaded] * With the wrong metric, you could be misreporting what real user experiences are like * Apdex = 1 while users silently suffer

Slide 149

Slide 149 text

@nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics [Diagram: the same samples with T = 2, so the tolerating threshold extends to 8s] * Remember, tolerating = 4 × T * Users with load times of up to 8 seconds are going to look fine in your Apdex

Slide 150

Slide 150 text

@nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics You only get out what you put in

Slide 151

Slide 151 text

@nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics Useful as a tool for businesses

Slide 152

Slide 152 text

@nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics Company Apdex = Jira * 0.2 + Confluence * 0.5 + Trello * 0.3 Note: not real numbers This is useful for us Note: each product defines its own Apdex Share: T values, expectations, etc. Customise: what a “meaningful” state or interaction means for your users

Slide 153

Slide 153 text

@nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics Summary

Slide 154

Slide 154 text

@nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics Measure real user experiences

Slide 155

Slide 155 text

@nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics Pick the right metric * Check what your metric is. Sanity check it. Is it really capturing the true user experience?

Slide 156

Slide 156 text

@nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics Grow a performance culture * You need to institute a performance culture. * Performance needs to be something thought about during planning and design – not just engineering

Slide 157

Slide 157 text

@nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics Let’s make the internet better * The internet is a vehicle for free information for all * For me, dollars aren’t inspiring. But the free internet, the open exchange of information, is out of reach for some people unless the internet is fast.

Slide 158

Slide 158 text

@nelsonjoshpaul jpnelson https://tinyurl.com/meaningful-performance-metrics Josh Nelson Atlassian @nelsonjoshpaul he/him/his Thanks! Questions? Illustrations designed by Freepik from flaticon.com tinyurl.com/meaningful-performance-metrics