Table of Contents: Introduction, Methodology, Audit, Boomerang Lifecycle, Loader Snippet, mPulse CDN, boomerang.js size, boomerang.js Parse Time, boomerang.js init(), config.json (mPulse), Work at onload, The Beacon, Work beyond onload, Work at Unload, TL;DR Summary, Why did we write this article? 1. Introduction: Boomerang is an open-source JavaScript library that measures the page load experience of […]
Last year I published an article introducing service worker caching as a web performance optimization technique. This year I will expand on how to use the service worker cache by covering cache invalidation. Invalidation must be applied properly to ensure your websites work offline, load instantly, and include the freshest content possible […]
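The excerpt above is about invalidating service worker caches. A common approach is to version the cache name and delete stale caches on activation; a minimal sketch (the cache names and `CACHE_VERSION` constant are illustrative assumptions, not from the article):

```javascript
// Versioned-cache invalidation sketch. Bumping CACHE_VERSION on each
// release causes old caches to be swept away when the new worker activates.
const CACHE_VERSION = 'static-v2';

// Pure helper: given all existing cache names, return the stale ones.
function cachesToDelete(allNames, current) {
  return allNames.filter((name) => name !== current);
}

// In a real service worker the activate handler would use it like this:
// self.addEventListener('activate', (event) => {
//   event.waitUntil(
//     caches.keys().then((names) =>
//       Promise.all(
//         cachesToDelete(names, CACHE_VERSION).map((n) => caches.delete(n))
//       )
//     )
//   );
// });
```

Keeping the "which caches are stale" decision in a pure function makes the invalidation logic easy to unit-test outside the service worker context.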
Another great year for this here little performance calendar. We were so lucky to have excellent contributions and, once again, went past the 24-article count expected of the “advent” format. And I could afford to say no to republications. I mean, all authors are always free to republish their articles on their blogs or […]
As we all adventure around this space that we call the Internet, consuming content is often on our minds. Naturally, with the vast amount of data available, filtering out what’s not interesting is a huge time saver. In order to help you find your ideal hotel at the best price, trivago’s filters are one of the […]
Migration scripts. We’ve all had to write them. They are the background, throwaway scripts that don’t have any performance requirements. Who cares if the script you planned to run overnight takes an extra hour to finish, especially since it doesn’t directly impact users? And generally all of that is true. Of course, when the situation […]
We talk a lot about performance: how it can be improved and which tools to use to improve it, but less about how to keep performance at a proper level once it has been reached. So, let’s take a look at tools that can help you do so. Size Limit ai/size-limit Size Limit is a tool to prevent […]
[TL;DR]: This post quickly gives a way to measure “ad weight” rather than the well-known “page weight”. This is an important consideration, given that it represents the bytes attributable to revenue. Background & Motivation The web performance community knows a lot about page weight, with a lot of tooling around it. For media sites […]
We are in an era where everyone involved in the SDLC understands their role in building a high-performance system. It is no longer the sole responsibility of a siloed Performance Engineering team to ensure the system’s performance is assessed and certified a few days before the move to production. Actually, the onset of Agile & DevOps development practices has brought […]
Tinder recently swiped right on the web. Their new responsive Progressive Web App – Tinder Online – is available to 100% of users on desktop and mobile, employing techniques for JavaScript performance optimization, Service Workers for network resilience and Push Notifications for chat engagement. Today we’ll walk through some of their performance learnings. Journey to […]
As more and more customers use high-bandwidth networks, video has become the norm on the web. Social media, websites and (of course) streaming services like YouTube and Netflix all stream right onto your phone. Research has shown that video enhances customer engagement, so we should expect that the growth of video on the web and […]
I guess web fonts are a big thing these days. Most of us know how to write @font-face declarations by now. You know all about WOFF and WOFF2, possibly a bit about subsetting, or maybe even font-display for more accessible rendering of text. Hell, maybe you just grab a <link> tag from Google Fonts […]
The old bottleneck: Network In the old days, delivering a fast user experience depended primarily on download speed. One reason why the network was the main bottleneck back then is that JavaScript and CSS weren’t used as much as they are now, so CPU wasn’t a critical factor. According to the HTTP Archive, the top 1000 […]
Recent years have seen a drastic increase in websites migrating to HTTPS. There are many benefits that come along with this migration. One such benefit is access to modern technologies that improve the performance and experience of end users. At eBay, when we started migrating pages to HTTPS, we also started looking into how we […]
If you’ve never heard of Memcached, it is simply a high-performance, distributed memory caching system which uses a key-value store for strings and objects. Usually it is used to store data originally retrieved from a database or external services. As simple as it is, it can improve the performance of your website quite a bit. The […]
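The "save data retrieved from a database" usage described above is the classic cache-aside pattern; a minimal sketch, where a plain `Map` stands in for a real Memcached client and `getUser`/`loadFromDb` are hypothetical names:

```javascript
// Cache-aside sketch: check the cache first, fall back to the database,
// then populate the cache so the next lookup is a hit.
// A Map stands in for a real Memcached client (same get/set idea).
const cache = new Map();

async function getUser(id, loadFromDb) {
  const key = `user:${id}`;
  if (cache.has(key)) {
    return cache.get(key); // cache hit: no database round trip
  }
  const value = await loadFromDb(id); // cache miss: query the source of truth
  cache.set(key, value);
  return value;
}
```

With a real Memcached client you would also set an expiration time on each key, so stale entries age out rather than living forever.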
Webfonts are one of the very first things we add to our sites, whether for individuality or stylistic reasons. Unfortunately, they are also infamous for making our sites painfully slow and frustrating to read. The end result? Frustrated users and lots of lost business opportunities. Thankfully they’ve been given lots of attention the last […]
Introducing rUXt: Visualizing Real User Experience Data for 1.2 million Websites
by Inian Parameshwaran
HTTP Archive provides an amazing trove of synthetic monitoring data for the top Alexa websites. Analyzing this data has led to a ton of key performance insights over the years. However, it does not capture the diversity of users across the web – which is better analysed by looking at Real User Monitoring […]
This post is about the DOM2AFrame proof-of-concept (and extremely non-production-ready) library, which transcodes typical HTML/CSS webpages to WebVR-compatible UIs on the fly, while maintaining support for full interaction and animation at good enough™ framerates. This fun project has helped us gain deeper insights into CSS handling, the browser’s rendering pipeline, supported and missing JavaScript APIs and […]
There are many case studies documenting how web performance and business metrics like conversion rates and ad revenue are correlated. But when a website is slow, those lost conversions and ad dollars don’t just evaporate — they go to a competitor. To get a sense of this risk, you can turn to competitive analysis: the practice of benchmarking […]
Actual Input Latency: cross-browser measurement and the Hasal testing framework
by Bobby Chien, Fu-Hung Yen, Mike Lien, Shako Ho, Walter Chen
This is a story about an engineering team at Mozilla, based in Taipei, that was tasked with measuring performance and solving some specific performance bottlenecks in Firefox. It is also a story about user-reported performance issues that were turned into actionable insights. It is the story of how we developed Hasal, a framework for testing […]
One can easily find any number of blog posts that describe how to improve web and cloud application scalability. Almost to a tee, however, the information provided tends to constitute qualitative actions rather than quantitative analysis. In order to decide which recipes lead to the best results in practice, some kind of cost-benefit analysis is […]
As a web performance consultant I tell my clients that a faster webpage will give them an edge over their competition. Lower bounce rate. Better conversion rate. Better everything. A couple of years ago I started an experiment to prove it on a green field. In January 2014 I started mehr-schulferien.de, which is a German webpage […]
We live in very exciting times from the performance point of view. The complexity and scale of the problems we are trying to solve are skyrocketing in almost every area (IoT, big data, AI, etc.), each bringing a new level of performance challenges. But we are also getting opportunities and technologies to address these challenges […]
Caching assets in the browser is the most common and most obvious way to improve front end performance. But at some point every developer accidentally makes a bad release of an asset with a long cache lifetime. There is a way back! Here’s how to throw the kill-switch. If you’re a web developer, you, like […]
Today, I would like to share with you some HTTP headers that are not fully implemented or specified but could lead to a simpler CDN configuration and better performance. This post will cover: how the Client Hints request headers and the Key response header can simplify adaptive design, and how Server Timing can simplify CDN monitoring. Simplify adaptive […]
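To illustrate the Server Timing idea mentioned above: a server sends per-request metrics in a `Server-Timing` response header, which monitoring code can then pick apart. A minimal, deliberately non-spec-complete parser sketch (the example header value is an assumption):

```javascript
// Minimal parser for a Server-Timing header value such as:
//   db;dur=53, cache;desc="HIT"
// Each comma-separated entry is a metric name plus optional
// dur= (duration in ms) and desc= (description) parameters.
function parseServerTiming(headerValue) {
  return headerValue.split(',').map((entry) => {
    const [name, ...params] = entry.trim().split(';');
    const metric = { name: name.trim(), dur: 0, desc: '' };
    for (const p of params) {
      const [k, v] = p.trim().split('=');
      if (k === 'dur') metric.dur = parseFloat(v);
      if (k === 'desc') metric.desc = v.replace(/^"|"$/g, '');
    }
    return metric;
  });
}
```

In browsers that support it, the same data is exposed ready-parsed on navigation and resource performance entries, so a hand-rolled parser like this is mainly useful server-side or in tests.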
Available since IE9, DOM Ranges have mostly been advocated only as a way to select some text on the document, then copy it, or pollute users’ selections with extra content for social sharing. These use cases are great and useful … but they are only half of the story. What is a Range? If we know […]
Speed is a major contributor to user experience on modern web sites. It is important to pay attention not only to the technologies that said experiences are built with, but to the way they are designed as well. Proper speed design is a collaboration between product managers, UI designers and developers, as all the aspects of […]
tl;dr: Image placeholders in SVG are ready for prime time thanks to browser support and good rendering performance. By automating SVG shape creation that mimics the main features visible inside an image and compressing the result appropriately, we can achieve SVG-based image placeholders weighing in at only ~400 bytes. SQIP is a tool to make this process […]
tl;dr: GIFs are awesome but terrible for quality and performance. Replacing GIFs with <video> is better but has performance drawbacks: not preloaded, uses range requests. Now you can use <img src=".mp4"> in Safari Technology Preview. Early results show mp4s in <img> tags display 20x faster and decode 7x faster than the GIF equivalent – in addition […]
At the Wikimedia Foundation we’ve been working on finding web performance regressions for a couple of years. We are slowly getting more confident in our metrics and finding regressions more easily. Today I want to show you how we automated finding regressions in production using open source tools. Back in the days – RUM only When we […]
If you visit sites such as Facebook, Pinterest or Medium regularly, you may have noticed that the first time you load a page, you’ll see low-quality or even blurry images. Then, as the page continues to load, the blurry, low-quality images are replaced with the full-quality versions. […]
Sometimes it feels like we keep inventing new metrics just to mess with people and move the goalposts, but for the most part, new performance metrics come along as our understanding of the user experience and the landscape of the web evolves.