Web Performance Calendar

The speed geek's favorite time of year
2020 Edition

Rick Viscomi

Rick Viscomi (@rick_viscomi) is a Senior Developer Programs Engineer at Google working on web transparency initiatives like the Chrome UX Report and HTTP Archive. Rick is also the host of the video series The State of the Web and a coauthor of Using WebPageTest.

Web performance can mean a lot of different things to a lot of different people. Fundamentally, it’s a question of how fast a web page is. But fast to whom?

When this page loaded moments ago, was it fast? If so, congratulations, you had a fast experience. So ask yourself, does that make this a fast page? Not so fast! Just because you had a fast experience doesn’t mean everyone else does too. You might even revisit this page and have yourself a slow experience.

Let’s say that you and everyone else who loads this page all have fast experiences. Surely that makes it a fast page, right? Most people would agree. Hypothetically, what if everyone’s internet speeds get 100x slower overnight? Now, all experiences on this page are suddenly slow. Is this page, which is byte-for-byte identical to what it was yesterday, still fast?

Fast is a concept that exists in the minds of users as they browse the web. It’s not that the page is fast—the experience is fast.

Ok, that’s enough philosophy. Why does it matter? Because there’s a difference between a page that’s built for speed and a page that feels fast. A svelte page could feel slow to someone having network issues. A heavily unoptimized page could feel fast to someone on high-end hardware. The proportions of those types of users can determine how fast a page is experienced in aggregate, even more so than how well-optimized it actually is.

How you approach measuring a web page’s performance can tell you whether it’s built for speed or whether it feels fast. We call them lab and field tools. Lab tools are the microscopes that inspect a page for all possible points of friction. Field tools are the binoculars that give you an overview of how users are experiencing the page.

A lab tool like WebPageTest or Lighthouse can tell you thousands of facts about how the page was built and how quickly the page loaded from its perspective. This makes lab tools irreplaceable for inspecting and diagnosing performance issues. You can visualize every step of the page load and drill down into what’s holding it up. Lab tools can even make informed recommendations for things they think you should fix, saving you the investigative time and effort. But despite their advantages, lab tools can lead you astray in subtle ways.

Similar to the problem of your fast experience not necessarily reflecting everyone else’s, your lab test might not be configured like most users in two important ways: access and behavior. A lab tool accesses a web page from a specific hardware and network configuration, which can greatly affect the page’s loading performance. A lab tool might not behave in ways that mimic real users either; for example, the test might not be logged in, scroll the page after it loads, or click on buttons.

This problem is becoming more and more apparent as developers rightly focus on user-centric metrics. Core Web Vitals represent a few distinct aspects of a good user experience: loading performance, input responsiveness, and layout stability. These are measured by Largest Contentful Paint (LCP), First Input Delay (FID), and Cumulative Layout Shift (CLS) respectively. So what could go wrong with measuring these metrics in the lab?

LCP is the time at which the largest piece of content rendered on screen. Load times can depend heavily on how fast the network is, so the lab configuration can produce wildly different LCP values based on its bandwidth and latency settings. Large content like images may also be cached and immediately available for some users, but lab tests tend to run with empty caches, necessitating another trip over the network.
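
In the field, LCP is exposed through the browser’s PerformanceObserver API. Here’s a minimal sketch of the reporting logic; the observer wiring is shown as a comment because it only runs in a browser, and the helper just tracks the latest LCP candidate:

```javascript
// The browser may emit several 'largest-contentful-paint' entries as
// progressively larger content renders; the last one reported before
// first input (or the page being hidden) is the final LCP value.
let lcpTime = 0;

function onLcpEntries(entries) {
  for (const entry of entries) {
    // startTime is the render (or load) time of the current candidate.
    lcpTime = entry.startTime;
  }
  return lcpTime;
}

// In a browser context:
// new PerformanceObserver((list) => onLcpEntries(list.getEntries()))
//   .observe({ type: 'largest-contentful-paint', buffered: true });
```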

FID is the delay from first interacting with the page, like a click, to the time that the browser is ready to respond to it. The main thread could be so busy with script execution or DOM construction that the event handler has to wait its turn. The obvious limitation with testing a page in the lab is that there aren’t any users to interact with it! There are diagnostic metrics for interactivity in the lab, like Total Blocking Time (TBT), but these don’t actually measure the user experience. We can fake FID and simulate a user’s click, but the questions of what to click and when to click it can be very subjective.
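
When a real user does interact, the field measurement itself is simple: the delay is the gap between when the interaction started and when the browser could begin processing it. A sketch, with the observer wiring commented out since it only runs in a browser:

```javascript
// A 'first-input' performance entry carries both timestamps:
// startTime is when the user interacted, processingStart is when the
// browser was finally free to run the event handler.
function firstInputDelay(entry) {
  return entry.processingStart - entry.startTime;
}

// In a browser context (one 'first-input' entry is emitted per page):
// new PerformanceObserver((list) => {
//   const entry = list.getEntries()[0];
//   console.log('FID:', firstInputDelay(entry), 'ms');
// }).observe({ type: 'first-input', buffered: true });
```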

CLS is roughly the proportion of the viewport that shifted as a result of layout instability. A layout could have a moment of instability when elements are suddenly added or removed and the positions of neighboring contents shift. Because the layout shift score is a proportion of the viewport, CLS can be very different between phones and desktops. The type of device used or emulated in the lab directly affects how CLS is calculated. There’s another issue having to do with user behavior: when to stop measuring. Lab tools tend to stop when the page is loaded, but real users are just getting started interacting with the page and potentially incurring many more layout shifts. Real users scroll and click and trigger new sorts of conditions that contribute to layout instability. Simulating these behaviors in the lab would be closer to reality but it has similar challenges to FID.
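
The running calculation behind CLS can be sketched as a sum of layout-shift scores, excluding shifts that follow recent user input, since user-triggered movement is considered expected:

```javascript
// Each 'layout-shift' entry has a value (impact x distance fraction of
// the viewport) and a hadRecentInput flag for shifts within 500ms of a
// user interaction, which are excluded from the score.
function cumulativeLayoutShift(entries) {
  let cls = 0;
  for (const entry of entries) {
    if (!entry.hadRecentInput) {
      cls += entry.value;
    }
  }
  return cls;
}

// In a browser context:
// new PerformanceObserver((list) => {
//   console.log('CLS so far:', cumulativeLayoutShift(list.getEntries()));
// }).observe({ type: 'layout-shift', buffered: true });
```

Note that because the score keeps accumulating for as long as the page is open, stopping the observer at “page loaded” (as lab tools tend to) undercounts everything a real user triggers afterward.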

This is why field data is the ground truth for how a page is experienced. At best we can only simulate user experiences in the lab, and we’d still be hypothesizing how a user would access a page and how they’d behave once they get there.

But wait! What if we calibrate our lab configurations based on real-user data from the field? This isn’t a new idea; developers have been calibrating access factors like geographic location, browser, and network speed based on field data for years. But now it’s more important than ever to calibrate behavior as well. For example, we can use analytics to see what users tend to click on first and when they click on it.
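
As one illustration of calibrating behavior from field data, here is a sketch: given time-to-first-click samples from analytics (the sample values below are invented for the example), pick the 75th percentile as the moment a scripted lab test should simulate its click:

```javascript
// Nearest-rank percentile over a list of field samples. p75 is a
// common choice for Web Vitals-style summaries of a distribution.
function percentile(samples, p) {
  const sorted = [...samples].sort((a, b) => a - b);
  const index = Math.ceil((p / 100) * sorted.length) - 1;
  return sorted[Math.max(0, index)];
}

// Hypothetical time-to-first-click samples (ms) from analytics.
const firstClickTimes = [800, 1200, 1500, 2100, 3000, 4500, 9000, 12000];
const clickAt = percentile(firstClickTimes, 75);
// A scripted lab test could then simulate its click at `clickAt` ms.
```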

Some lab tools like WebPageTest are advanced enough to be able to script that behavior into the test. But a popular tool like PageSpeed Insights (PSI) has no configurability beyond plugging in the URL you want to test, so you need to take its lab results with a grain of salt. Keep in mind that performance is a distribution, and one lab test is just a single contrived data point.

Fear not, even unrealistic lab tests can still be useful. One practical application of this is to test for worst case scenarios. You may not be able to say with certainty that anyone who visits your page will have a fast experience, but if you can make it seem fast under even the slowest conditions, that goes a long way. Stress testing your page’s performance by using (or emulating) low-end hardware over strained network speeds is a great way to magnify the power of the microscope to bring more performance problems into focus. This is an opportunity to fix issues before users may even experience them.
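
One way to set up such a stress test is to throttle Chrome through the DevTools Protocol. The profile below is an assumed slow-network/low-end-device configuration, not an official preset, and the Puppeteer calls are shown as comments since they require a running browser:

```javascript
// Build the parameter object for the Chrome DevTools Protocol command
// Network.emulateNetworkConditions. The numbers are an assumed
// worst-case profile: ~400 Kbps each way with 400ms round-trip latency.
function stressProfile() {
  const kbps = 400;
  return {
    offline: false,
    latency: 400,                          // round-trip time in ms
    downloadThroughput: (kbps * 1024) / 8, // CDP expects bytes/second
    uploadThroughput: (kbps * 1024) / 8,
  };
}

// With Puppeteer, before page.goto():
// const client = await page.target().createCDPSession();
// await client.send('Network.emulateNetworkConditions', stressProfile());
// await client.send('Emulation.setCPUThrottlingRate', { rate: 6 });
```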

What if users aren’t experiencing this slow performance because they’re conditioned not to? An experience can be so poor that the user abandons it before it gets any worse. They may never come back to the site at all, in which case your field data has survivorship bias where only the bearably slow experiences are measured. How many unbearably slow experiences aren’t you measuring? And you thought we were done with the philosophical questions!

Let’s stop here and recap:

  • Individual experiences are just data points along a distribution. What feels fast depends on the conditions under which it was experienced. Everyone’s conditions are different.
  • Lab tests may not be configured to be representative of the most common experiences on the curve, or any experience on the curve for that matter.
  • User-centric metrics require extra care to ensure that behaviors are emulated faithfully in the lab.

As a web development community, we need to change more than just our mindset about “fast” web pages. It’s not enough to be aware of the pitfalls that can lead us astray: to avoid them requires a concerted effort between the makers and users of performance testing tools.

Lab tools must be configurable to access and behave like real users. There is no one-size-fits-all lab configuration that represents how users experience all pages. Developers need to be active participants in the configuration process—not necessarily down to the Mbps of bandwidth, but they should make high-level decisions about what type of user they’re simulating in the lab. This could be a manual guessing game, but at least developers are made more conscious of the relevance of the results.

An even better solution would be to build stronger data bridges between field and lab tools, so that the lab tool itself can make informed recommendations about the most realistic user profiles to simulate.

We’re at an exciting inflection point in the power of developer tooling. As newer metrics focus on how pages are experienced from users’ perspectives, we have an opportunity to rethink and reshape the ways our tools help us to measure and optimize them. By instrumenting lab tools with the behavioral characteristics of real users, we can unlock new opportunities to improve experiences beyond the page load.

24 Responses to “The mythical “fast” web page”

  1. anatol broder

    (1) is the mythical «web development community» a google term for chrome devtools users?

    (2) why do you want me to load two photographs (your smiling face + random bronze sculpture = 84% of bytes) before i read your article about poor user experience?

    (3) why there is no viewport meta tag on this page?

  2. Rick Viscomi

    Silver, actually

  3. Collective #638 - Coduza - Blog

    […] The mythical “fast” web page […]

  4. Collective #638 - GB YouTube - Blog

    […] The mythical “fast” web page […]

  5. Katsampu

    Hi Rick, nice post!

    And here is a story of the rendering performance of a website with a huge amount of traffic (1.5million daily).

  6. Web Design & Development News: Collective #638 | Codrops

    […] The mythical “fast” web page […]

  7. Ingo Steinke

    Thanks for the article! Verified the effect yesterday, when a lighthouse report of my old, simple website with nearly no assets nor javascript showed very poor results, while in WebPageTest, it scored brilliantly. In my subjective view, it always loaded instantly anyway. So we should always question not only our results but also our assumptions.

  8. stoyan

    (3) why there is no viewport meta tag on this page?

    I’m working on a redesign, will address this

  9. Web Performance Calendar – Computer Science feeds

    […] The Web Performance Calendar just started up again this year. The first two posts so far are about, well, performance! First up, Rick Viscomi writes about the mythical “fast” web page: […]

  10. Web Performance Calendar - WP Info

    […] The first two posts so far are about performance! First up, Rick Viscomi writes about the mythical “fast” web page as follows […]

  11. Web Performance Calendar

    […] The Web Performance Calendar just started up again this year. The first two posts so far are about, well, performance! First up, Rick Viscomi writes about the mythical “fast” web page: […]

  12. Web Performance Calendar » Pixallus

    […] The Web Performance Calendar just started up again this year. The first two posts so far are about, well, performance! First up, Rick Viscomi writes about the mythical “fast” web page: […]

  13. Anderson

    danluu dot com is fast.
    This page isn’t.

  14. oldschool

    why would someone write this post? aka what does the author want to believe?
    The author believes that the variability of hardware and devices today is reason enough not to worry about webpage optimization. So any complaints about a webpage are therefore probably related to the user.
    Instead of writing this post, you could have just said “works on my machine” which ironically would lead to a faster load of this page instead of this vacuous garbage.

  15. Ben Marshall

    Good read! Especially like that you mention:

    Keep in mind that performance is a distribution, and one lab test is just a single contrived data point.

    I have to remind some co-workers daily that those scores aren’t the be-all-end-all. So frustrating when you tell them you’re getting one score, but then they complain they’re getting something else.

    Know of any good blog posts about those scores and more specifics about why each run is different?

  16. Sam

    This post sounds like it’s trying really hard to rationalize or justify something.

    What exactly are you doing? Just do less of it. The culprits are never a surprise: lots of requests, JS, CSS, custom fonts, mis-sized images, data which isn’t necessary for the page to display, not using a CDN, using an interpreted language to serve up static pages, etc. You can load an entire front page in the blink of an eye, even on a slow cellular connection.

    I’ve never had such “high-end hardware” that a slow page feels fast. They only feel slightly less slow. We still know they’re slow. You’re not fooling anyone.

  17. sulfide

    fast webpages don’t matter that much to me. Take reddit for example: slow AF and I don’t care, I’ll still use it as long as it provides me value. I just want it to be reasonably in line with whatever expectation I’m feeling at that moment in time, but that could change minute to minute 😀 I would rather developers provide me features and fewer bugs than worry about how fast it’s going to be

  18. Ted

    sulfide: That’s called Selection Bias. Reddit has gotten slower and added lots of features, so Reddit users are naturally the set of people who value features over speed. People who don’t share those priorities have left for other websites.

    There’s plenty of research to support the claim that internet users, on the whole, do care greatly about page load speed. I think it’s great that you found a website you like, but be careful not to mistake one’s personal opinion for a majority viewpoint. Social media is notorious for nudging people to conflate the two.

  19. Collective #638 – Enjoy Web

    […] The mythical “fast” web page […]

  20. Infused Agency

    Great article! Love how you made the distinction between, ‘yes this web page loads fast’ vs ‘is it actually fast though?’

  21. Gaurav Gupta

    Just to reiterate, you are suggesting that the lab tools provide an understanding of whether the app is built for speed, and the field tools provide an understanding of whether the experience is fast?
