Web Performance Calendar

The speed geek's favorite time of year
2022 Edition
ABOUT THE AUTHOR
Barry Pollard

Barry Pollard (@tunetheweb, @tunetheweb@webperf.social) is a Web Performance Developer Advocate in the Google Chrome team, working on Core Web Vitals and the Chrome User Experience Report (CrUX), and is one of the maintainers of the HTTP Archive. He's also the author of HTTP/2 in Action from Manning Publications.

Fast page loading is so last year… you should be thinking about instant page loads!

Introduction

We web performance aficionados spend a lot of time optimising web page loads and staring at waterfalls – and that’s a good thing, and we should continue doing it. But there is only so much we can do given the nature of the web: it’s a request/response medium served over a global network. As such, there is an inherent delay to web browsing compared to reading a physical newspaper or book – no matter how fast we make our pages.

Core Web Vitals uses LCP as its loading metric, with a 2.5 second “good” threshold. That threshold was chosen, in part, to ensure the metric is achievable – ideally it would perhaps be closer to 1 second, or even lower. Even so, 2.5 seconds is already proving hard for many websites to meet. LCP is the Core Web Vital that most sites fail – even when including the newer INP metric that has many people excited as a better interactivity metric than FID.

To give an experience where the load is not just “acceptable” but completely unnoticeable, we need to look beyond optimising our loading speeds and think about how we can hide that load delay from the user altogether.

But how can that be possible, when I’ve just said the delay is an inherent part of web browsing? Well, there are a couple of technologies that make the seemingly impossible possible. I’m pretty excited by them and think more website owners and developers need to be aware of them, and to ensure their websites can use them.

Instant pages with the back/forward cache

The back/forward cache (or bfcache for short) is a browser optimization that stores the fully loaded page (including the JavaScript heap) in memory for a period after a user navigates away. That way if the user goes back to a page it can be restored instantly. And the same when going forward again after going back.
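
If you want to see this in action on your own site, the pageshow and pagehide events expose a persisted flag indicating a bfcache restore. Here’s a minimal sketch – the console.log calls are just placeholders for whatever reporting you use:

// Fires on every navigation to the page, including bfcache restores.
window.addEventListener('pageshow', (event) => {
  if (event.persisted) {
    // true only when the page was restored from the bfcache
    console.log('Restored from the back/forward cache');
  }
});

// Fires as the page is hidden; persisted indicates the page *may*
// be placed into the bfcache (the browser can still discard it later).
window.addEventListener('pagehide', (event) => {
  if (event.persisted) {
    console.log('Page is eligible to enter the back/forward cache');
  }
});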

So that’s one way to get these sought-after instant page loads. OK, but that’s cheating, right? It’s not a “real” page load after all. Well, as far as the Core Web Vitals initiative is concerned, this is as much a page navigation as any other navigation, and when this was released in Chrome (around the time of last year’s Web Performance Calendar), it made a noticeable change in sites passing Core Web Vitals – particularly for CLS, where the proportion of mobile sites passing increased by 6 percentage points (from 67% to 73%). Internal measurements from Chrome suggest 20% of mobile navigations are back/forward navigations (10% on desktop), so that’s a sizable chunk of navigations that could make a dent in your users’ experience and your site’s Core Web Vitals.

As an aside, I believe it should be considered a full page navigation everywhere, as I fail to see the difference between a back navigation where the bfcache is not used and one where it is. But that’s a bit of a sidetrack from the topic here…

Chrome was particularly late to the wonders of the bfcache – Safari and Firefox have had it for years! However, the Chrome team did a great job of 1) documenting it (full disclosure: I help maintain this documentation now, after joining the team earlier this year), 2) chasing down third parties that prevented it being used, and 3) adding tooling to allow you to see where it cannot be used.

The bfcache is one of those browser optimisations that should just work. Sites don’t need to do anything to enable it, but they can do things to (usually inadvertently) disable it. The two biggest reasons a page cannot use the bfcache are unload handlers, and setting cache-control: no-store on the main document response.

Unload handlers in theory run as the page is being unloaded, which can seem like a useful way to send analytics beacons or perform other clean-up. However, they are completely unreliable and, more often than not, will fail to run at all. This is particularly true on mobile, where pages are often backgrounded (as users switch away from the browser to another app), then silently killed in the background to conserve memory, and reloaded when the user returns to that browser tab.

When a page is put into the bfcache it is not technically “unloaded” – it may later be restored, or it may be thrown away. So the browser is left with a choice: run those unload handlers and then restore the page anyway (which may be completely unexpected by the page), ignore the unload handlers, or simply make the page ineligible for the bfcache. And different browsers make different choices here.

Firefox takes the cautious approach of making all pages that use unload handlers ineligible for the bfcache (and similarly with the beforeunload handler, which is perhaps less risky to rerun since it is not guaranteed to be unique). Safari takes the complete opposite view and says that, since the unload handler is basically unreliable anyway, it will use the bfcache and not fire these handlers when putting pages in there. Chrome follows Safari on mobile, but not on desktop, where pages using unload handlers are prevented from using the bfcache (since unload handlers were traditionally at least a little more reliable there, and desktop web apps may have more complex needs).

The basic point is that unload handlers are unreliable AND cost you performance by making you ineligible for the bfcache on some platforms. Stop using them. And chase any third parties on your site that are still using them. For now, the Page Visibility API is the best alternative: act when the page is backgrounded (possibly never to be foregrounded again!). Looking further ahead, the Pending Beacon API is a new initiative from the Chrome team to address the shortcomings of the unload beacon by moving the beaconing into the browser’s control instead of the web page’s JavaScript (which can only run when the page is foregrounded). Philip Walton has worked on a polyfill to allow you to use this more easily while falling back to the older Page Visibility API for other browsers.
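
As a rough sketch of that alternative (the /analytics endpoint is purely hypothetical – substitute your own), listening for the page becoming hidden and using navigator.sendBeacon is far more dependable than an unload handler, and doesn’t block bfcache eligibility:

// Treat "hidden" as the last reliable opportunity to report –
// the page may never be foregrounded (or unloaded) again.
document.addEventListener('visibilitychange', () => {
  if (document.visibilityState === 'hidden') {
    // sendBeacon queues the request with the browser, so it can
    // complete even if the page is backgrounded or discarded.
    navigator.sendBeacon('/analytics', JSON.stringify({ event: 'page-hidden' }));
  }
});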

The other issue is setting cache-control: no-store on the main document response. Currently this makes a page ineligible for the bfcache (there are discussions within the Chrome team as to whether we should change this, since it’s an HTTP cache directive rather than a memory cache directive). You should really only use no-store for sensitive pages containing personal information. For pages that you just want to keep fresh (for example, a news outlet’s home page), cache-control: no-cache (or the equivalent cache-control: max-age=0) is sufficient: it forces revalidation on load and reload, but still allows the bfcache to be used. And in my opinion, the vast majority of pages do not NEED a zero cache lifetime – even a small one would benefit users, even when the bfcache is not used.
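
As a quick illustration (an Express-style Node server is assumed here purely for the example), the difference is just which directive you send on the main document:

const express = require('express');
const app = express();

// A page you want kept fresh: forces revalidation on load/reload,
// but leaves the page eligible for the bfcache.
app.get('/', (req, res) => {
  res.set('Cache-Control', 'no-cache'); // or 'max-age=0'
  res.send('<h1>Home page</h1>'); // page body elided
});

// Reserve no-store for genuinely sensitive pages only – this currently
// also makes the page ineligible for the bfcache.
app.get('/account', (req, res) => {
  res.set('Cache-Control', 'no-store');
  res.send('<h1>Your account</h1>'); // page body elided
});

app.listen(3000);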

There are myriad other reasons why a page can become ineligible for the bfcache, but those are the two main ones. The Back/forward Cache testing tool in Chrome DevTools is a great way of checking for the common reasons, and we’re working on an equivalent Lighthouse audit too.

Each page may be different though, and running all your pages through either of these tools is unrealistic. The NotRestoredReasons API allows you to monitor this in the field to more easily identify the reasons, and it is currently in an origin trial in Chrome, allowing you to test it on your site.
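
A field-monitoring sketch might look like the following – note the API is still in an origin trial, so the exact shape of the notRestoredReasons object may change, and /bfcache-report is just a hypothetical reporting endpoint:

// On a history navigation that could not use the bfcache, the
// navigation timing entry carries the reasons why.
const [navEntry] = performance.getEntriesByType('navigation');

if (navEntry && navEntry.notRestoredReasons) {
  // Beacon the reasons to your RUM endpoint for aggregation.
  navigator.sendBeacon(
    '/bfcache-report',
    JSON.stringify(navEntry.notRestoredReasons)
  );
}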

Finally, there is a proposal to have an Unload Permission Policy, where a site can instruct Chrome not to run any unload handlers, and so prevent them being a reason for bfcache ineligibility. This can be helpful to prevent any third parties or extensions from adding these.
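
If that proposal ships, it would be set as a response header on the document. The sketch below again assumes an Express-style server, and the header value reflects the proposed syntax at the time of writing, which may well change before it is finalised:

const express = require('express');
const app = express();

app.use((req, res, next) => {
  // Proposed syntax (subject to change): ask the browser not to
  // allow unload handlers on this page at all.
  res.set('Permissions-Policy', 'unload=()');
  next();
});

app.listen(3000);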

Bfcache navigations are some of the fastest possible navigations, often resulting in zero LCP, reduced CLS, and much better FID and INP. As I wrote before, I truly believe it is a Performance Game Changer and “I honestly believe that sites that are ineligible for the Back/Forward Cache are giving up free web performance for their users, and making passing Core Web Vitals needlessly tough on themselves.” Make sure you’re not one of those sites!

Instant pages with prerender

The bfcache is great for page restores, but to restore a page we need to have visited it first. Can we make those first navigations “instant” too? Well, yes – thanks to the recently relaunched prerender functionality in Chrome. And I do mean recently, as we only just published documentation on this!

To summarise it, pages can use the Speculation Rules API by including a JSON script in the page, asking the browser to prerender a probable next navigation, or even a list of potential next navigations:

<script type="speculationrules">
{
  "prerender": [
    {
      "source": "list",
      "urls": ["next.html", "next2.html"]
    }
  ]
}
</script>

This is similar to the <link rel="prerender" href="/next-page/"> resource hint, which stopped doing a full prerender a few years ago for various reasons and now does a NoState Prefetch instead (basically fetching the page and all the subresources referenced in the HTML, but not doing a full render of that page, nor executing any JavaScript on it).

A prerendered page is rendered in a hidden background renderer, ready to replace the contents of the current tab if the user navigates to it. If the page is fully prerendered then you get the sought-after instant navigation.

Chrome measures the Core Web Vitals from page activation, rather than from the page load start time. This can result in a zero LCP for these prerendered navigations and, again, lower CLS and reduced FID and INP.
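
You can see this from JavaScript too. Here’s a minimal sketch, using the document.prerendering flag and the activationStart timing Chrome exposes, to tell the three cases apart – currently prerendering, prerendered then activated, or a normal load:

const [navEntry] = performance.getEntriesByType('navigation');
// activationStart is 0 for normal loads; otherwise it is the time
// (relative to the prerender starting) at which the user navigated.
const activationStart = navEntry ? navEntry.activationStart : 0;

if (document.prerendering) {
  // Still being prerendered in the hidden background renderer.
  document.addEventListener('prerenderingchange', () => {
    console.log('Prerendered page was just activated');
  }, { once: true });
} else if (activationStart > 0) {
  console.log(`Page was prerendered and activated at ${activationStart}ms`);
} else {
  console.log('Normal (non-prerendered) page load');
}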

Even if the page has not fully prerendered, having a head start on the navigation should still result in a faster load. A partial prerender can happen if, for example, the Speculation Rules JSON is inserted or acted upon when a user interacts with a link (e.g. on hover or mouse down) and the user clicks before the page has fully prerendered.
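
As a sketch of that pattern (the a.next-article selector is just a placeholder), you could inject a speculation rule only once the user hovers over a link, after feature-detecting support:

const link = document.querySelector('a.next-article'); // placeholder selector

link?.addEventListener('mouseover', () => {
  // Feature-detect Speculation Rules support before injecting anything.
  if (!HTMLScriptElement.supports ||
      !HTMLScriptElement.supports('speculationrules')) {
    return;
  }

  const script = document.createElement('script');
  script.type = 'speculationrules';
  script.textContent = JSON.stringify({
    prerender: [{ source: 'list', urls: [link.href] }]
  });
  document.head.appendChild(script);
}, { once: true });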

There are a few JavaScript libraries available that attempt to prefetch, or “prerender” (using the older NoState Prefetch, which does not actually prerender). I hope these libraries will be upgraded to use the new Speculation Rules API to allow full prerendering as an option.

Like many performance optimizations, prerender should be used in moderation. Prerendering pages uses up network, CPU and memory resources just like any other page load. Therefore pages should only be prerendered if there is a high likelihood that the user will visit that page. For some pages that may be known in advance (e.g. the headline article on a newspaper home page, or the “next” page in a series of articles). For others it may not be obvious and prerender either should not be used, or should only be used when the user indicates they will navigate to it as described above.

Prerendering is also a speculative hint to the browser, rather than a declarative instruction. As such, the browser may ignore the prerender request – particularly when the user is in Save-Data mode, or when Chrome is already using considerable amounts of memory.

Prerender is in active development by the Chrome team. Over the next few releases, expect to see the capabilities of prerender increase, as well as DevTools support for prerendered pages. I’d encourage development teams to start using this and provide feedback.

When used correctly, the performance and UX benefits of prerender (like the bfcache) are huge, giving instant page loads – and (unlike the bfcache) it can be used on pages on your site that have not yet been visited.

Instant pages with prerender from the omnibox

For privacy reasons, prerender can only be used on same-origin pages, though we are currently experimenting with same-site, cross-origin prerenders, so pages on different subdomains of the same site can be prerendered if those sites opt in.

This means you need to be on the site already to benefit from the Speculation Rules API. In addition, as it requires site owners to implement these changes, uptake may be low initially and take time to grow.

However, the Chrome team is also bringing the new prerender process to the Chrome omnibox. In certain cases, when Chrome has high confidence the URL will be visited, the browser will prerender the page as you type the URL – meaning sites can benefit from prerender without making any changes, or even being aware of this technology.

Chrome needs more than an 80% confidence that a typed URL will be visited, based on your previous browsing history, and you can see your own personal predictions by entering chrome://predictors into the omnibox.

For more than a 50% confidence, Chrome will preconnect to the domain, but not prefetch or prerender the page. This can still give a small performance boost, but will not provide instant page loads.

You may not be aware that your browser has been speeding up your navigations for some time – Steve Souders first wrote about it way back in 2014. Chrome has been using NoState Prefetch for high-confidence predictions, but is rolling out full prerender now, so you should start to see these improvements both in your own browsing experience (if you’re a Chrome user) and in your site’s page loads over the next few weeks and months. I’m excited to see the impact of prerender on Core Web Vitals pass rates.

Conclusion

Fast, responsive websites feel great, but we seem addicted to growing our websites just as quickly as technology improvements allow, filling any capacity we gain. This isn’t all negative though – the web is becoming a richer, more interactive medium. We’re also able to do so much more on the web – we even have Photoshop on the web now!

Larger, more complex sites are more difficult to make fast. And our websites are being seen by a wider variety of people all over the world, on many different devices and networks. It’s the nature of the web that these are completely out of our control, and while CDNs can definitely help reduce latency, they still can’t compensate for slow networks and slow devices.

The three ways described here of getting “instant” page loads have the potential to drastically improve the user experience of the web. We’ve already seen this with the bfcache, and I’m hoping that prerender (particularly from the Chrome omnibox) will lead to another web-wide bump in Core Web Vitals pass rates and, more importantly, in user experience.

As site owners, we can make sure our sites (and the third parties we load on them) do not trip us up and prevent these optimisations from being used. There are various ways of checking this for the bfcache (and more coming!), and we’re looking to learn from those and are considering similar checks for the new prerender, to identify the cases where it cannot be used.

More advanced sites should consider the Speculation Rules API so they can tell the browser which pages to prerender – when they have high confidence those pages will be visited by the user.

I’d also strongly encourage site owners to measure these non-standard navigations, and to compare how their Core Web Vitals differ on them versus standard page loads. I would love to hear about your experiences with them!