Web Performance Calendar

The speed geek's favorite time of year
2022 Edition

Stoyan (@stoyanstefanov) is an engineer on the WebPageTest.org team, formerly at Facebook and Yahoo!, writer ("JavaScript Patterns", "React: Up and Running"), speaker (JSConf, Velocity, Perf.now()), toolmaker (Smush.it, YSlow 2.0) and a guitar hero wannabe.

Looking back at when this here little calendar started (Hello 2009!) things are so infinitely better now. Browsers are anything but the black boxes they used to be. Boxes that you needed to poke about and see what comes back. Whether you’ve been living under a rock (haven’t we all, hello 2020!) or you’re on top of all the new APIs and even inventing them, I think it’s nice to spend a moment in appreciation of where we’re at now. So pour a cup of ${beverage}, settle by the fire, hoist your feet and let’s enumerate what’s new-ish and cool in the world of Web Performance APIs.

Render blocking status

Starting with render blocking status reminds us to appreciate all the Performance APIs that we can avail ourselves of these days. Can you imagine a world without the Long Tasks API? No Resource Timing? And no window.performance at all?! One shudders in horror at the thought…

So say hello to the new property on the block: the renderBlockingStatus property of PerformanceResourceTiming objects. It can tell you if a resource (looking at you CSS!) is blocking the rendering, meaning the browser has to wait for the resource before it can render stuff on the page.
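A minimal sketch of how you might use it, assuming a Chromium-based browser that exposes the property (the helper name pickBlocking is mine, not part of any spec):

```javascript
// Pick out the resources the browser flags as render-blocking.
// renderBlockingStatus is "blocking" or "non-blocking" on each
// PerformanceResourceTiming entry in supporting browsers.
const pickBlocking = (entries) =>
  entries
    .filter((entry) => entry.renderBlockingStatus === 'blocking')
    .map((entry) => entry.name);

// In the browser, feed it the resource entries:
// console.log(pickBlocking(performance.getEntriesByType('resource')));
```

In a real page you'd likely see your stylesheets show up in that list.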

So cool, let’s keep going, in no particular order…

Offscreen Canvas

Offscreen Canvas lets you do canvas operations off the main thread. The main thread is the one, the main one. The more you can stay out of it, the better: a less glitchy, less janky experience. It's available in Web Workers too, so it's a great opportunity to perform expensive operations separately and only show the new canvas in all its glory when you're ready. Basically, whenever you're about to use a canvas, ask yourself: should we take this offline?
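A quick sketch of the hand-off, assuming a worker script at a hypothetical "worker.js" (browser-only APIs, so the worker part is shown in comments):

```javascript
// Transfer control of a canvas to a Web Worker so all drawing
// happens off the main thread.
function handOff(canvas, workerUrl) {
  const offscreen = canvas.transferControlToOffscreen();
  const worker = new Worker(workerUrl);
  // The second argument transfers ownership instead of copying.
  worker.postMessage({ canvas: offscreen }, [offscreen]);
  return worker;
}

// worker.js would look something like:
// onmessage = (e) => {
//   const ctx = e.data.canvas.getContext('2d');
//   ctx.fillStyle = 'tomato';
//   ctx.fillRect(0, 0, 300, 150); // heavy drawing, main thread stays free
// };
```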


Speaking of off, here's a more niche one that has been around for a while but deserves an honorable mention: OfflineAudioContext. It allows you to render audio faster than real time, without involving the audio hardware.
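A sketch of the idea, under the assumption you want one second of a 440 Hz tone (browser-only API, so this won't run outside a page):

```javascript
// Render one second of mono audio at 44.1 kHz faster than real time,
// entirely off the audio hardware.
async function renderTone() {
  const sampleRate = 44100;
  const ctx = new OfflineAudioContext(1, sampleRate * 1, sampleRate);
  const osc = ctx.createOscillator();
  osc.frequency.value = 440;
  osc.connect(ctx.destination);
  osc.start();
  // Resolves with an AudioBuffer as soon as rendering is done —
  // typically much faster than the one second of audio it contains.
  return ctx.startRendering();
}
```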

Async image decoding

“Off the main thread!” is our mantra. Asynchronous image decoding is another one of these APIs. You can tell the browser that this image is not that important right now and its decoding can be parallelized along with other work. You can use it as an HTML attribute:

<img src="omg.jpg" decoding="async">

Or as a JavaScript API:

const i = new Image();
i.decoding = 'async';
i.src = 'omg.jpg';

Look at that second example. When is the image really the most important thing on the page? If it is, sync decoding is also an option. But I'd guess a large percentage of use cases would do better as async.

Image decode()

Ever done something like this:

const i = new Image();
i.src = "omg.jpg";
i.onload = () => {
  // yes! DOM-insertion time
};
i.onerror = () => {
  // doom and gloom
};

If so, well, stop it and look into the decode() method of image objects instead.

const i = new Image();
i.src = "omg.jpg";
i.decode().then(() => {
  // yes! DOM-insertion time
}).catch(() => {
  // doom and gloom
});

How is it different? Welp, the old-school way fetches the image over the wire and, as you insert it in the DOM, the decoding and painting happen on the main thread. Frames are dropped, people suffer.

With decode() you get parallelization, and also control. If you're one of those peeps (cough, React) who like to batch updates to the DOM, this is gold. You can use one requestAnimationFrame tick and do all updates to the DOM (including any already decoded and ready-to-go images) all in one go. What a time to be alive!
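The batching idea can be sketched like so (the helper name decodeAll is mine; the browser part is in comments since it needs a DOM):

```javascript
// Decode all images in parallel, then insert them into the DOM in a
// single requestAnimationFrame tick.
const decodeAll = (images) => Promise.all(images.map((img) => img.decode()));

// Browser usage, with `urls` being whatever list you have:
// const imgs = urls.map((src) => Object.assign(new Image(), { src }));
// decodeAll(imgs).then(() => {
//   requestAnimationFrame(() => {
//     imgs.forEach((img) => document.body.appendChild(img));
//   });
// });
```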


Lazy loading

Ah, good old lazy loading! So much JavaScript code has been written to load images lazily. And, imagine that, even before IntersectionObserver was a thing. (Add IntersectionObserver to the list of appreciation topics.)

Now you don’t need any of that. Add loading="lazy" to any image or iframe and forget about it. Well, not any image. Images that have a good chance of being above the fold should still skip the laziness and load ASAP.
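In markup that might look something like this (file names are made up):

```html
<!-- Below the fold: let the browser fetch these when they get close -->
<img src="gallery-photo.jpg" loading="lazy" alt="A gallery photo">
<iframe src="https://example.com/embed" loading="lazy"></iframe>

<!-- Likely above the fold: load eagerly, which is the default -->
<img src="hero.jpg" alt="The hero image">
```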

Story time: I’m happy to have been a small part of the loading=lazy adoption on the web. As I was leaving Facebook, the last thing I did was push the update to all social plugins, and the documentation, to support lazy loading. (Social plugins are iframes embedded in the host page; see the example docs.) The default is false (two years ago it was early days for the loading attribute); hopefully the default will be true soon enough.


fetchpriority

As you probably know, browsers have certain heuristics as to which resources (JS, CSS, images, fonts, etc.) are more important than others. E.g. blocking resources such as CSS should have a high priority, so these get downloaded first. Images are not blocking and are more of a “yeah, whenever you get around to it” type of deal. But you know all too well that some “hero” images are ever so important. Well, now with fetchpriority="high" you can hint to the browser to up the priority on those. Cool, huh?
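A sketch, assuming a hypothetical hero.jpg (and note that fetch() accepts a similar priority hint in supporting browsers):

```html
<!-- Nudge the hero image's priority up -->
<img src="hero.jpg" fetchpriority="high" alt="The hero image">

<!-- And the inverse: deprioritize a request you don't need urgently -->
<script>
  fetch('/api/not-urgent', { priority: 'low' });
</script>
```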

103 Early Hints

The 103 Early Hints HTTP status code is for those special times when your page gets a request and needs to do some work (think fetching stuff from a database). But the page also knows that no matter what the DB returns, there’s going to be some CSS-ing and JS-ing to do and maybe a logo to show. I say special times but they’re not that special; in fact, unless you have a static site, this is pretty much the default case.

So now while the HTML response is still being assembled dynamically, your server can send a 103 response and let the browser get started on fetching all the static stuff.

Achieving this has been possible for quite a while (e.g. using flushing, hello 2009!) but it has never been this convenient and hack-free.
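On the wire the exchange might look something like this (the file names are hypothetical):

```http
HTTP/1.1 103 Early Hints
Link: </styles.css>; rel=preload; as=style
Link: </app.js>; rel=preload; as=script

HTTP/1.1 200 OK
Content-Type: text/html

<!doctype html>
...
```

The browser can start fetching styles.css and app.js while the server is still busy assembling the 200 response.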

Can I Use (dot com) lists basically just Chrome as supporting it, but that’s not completely true. Just a few days ago Barry Pollard sent a PR to update the stats to include Edge too, after Paul Calvano double-checked it during a discussion on the Web Performance Slack.

Speaking of the Web Performance Slack: if you’re reading this, you should definitely join this friendly place for web performance discussions. I just joined a couple of weeks ago and it’s lovely. For an invite, just ask wherever you think performance geeks gather, e.g. webperf.social on Mastodon or right here in a comment.

Speculation Rules API

Speaking of Barry Pollard, here’s a bit of a speculative appreciation to wrap us up. Speculative, because it’s not out yet and not yet on Can I Use (dot com). And also because this up-and-coming API is called the “Speculation Rules API”. The aforementioned Barry wrote a post right here to whet your appetite. It’s all about pre-rendering. E.g. you know that as the user is typing a username/pass, there’s some chance of them getting it right. You can use some JS or HTML to preload static resources that will be needed on the next page. But now with the Speculation Rules API you can also prerender (render in the background) complete pages, not just the static resources needed by those pages. Hotness!
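A sketch of what the markup might look like (the syntax was still evolving at the time of writing, and /next-page.html is a made-up URL):

```html
<script type="speculationrules">
{
  "prerender": [
    { "source": "list", "urls": ["/next-page.html"] }
  ]
}
</script>
```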


So this holiday season, let’s raise our cups of beverage_of_choice ?? mulled_wine and appreciate how far we’ve come and how much more is in front of us. Here’s to a faster 2023!

Aaaand it wouldn’t be a 2022 calendar without a quote of a ChatGPT creation: an ode to web performance.

From frontend to backend, we’ll explore
The many ways to speed up more
We’ll optimize and test and measure
To make the web a faster treasure

On this day of perf delight
We’ll learn and share with all our might
To make the web a faster place
And put a smile on every face