Web Performance Calendar

The speed geek's favorite time of year
2020 Edition
ABOUT THE AUTHOR

Fabian Krumbholz (@fabkru) works as a Web Performance Consultant at Netcentric, a Cognizant Digital Business. He helps international brands manage and improve their web performance.

He also curates a web performance resource list on GitHub with more than 500 videos, articles, courses, and events.


I work for a company that builds websites for big international brands. These are the challenges I deal with as a web performance consultant:

  • Hundreds of people can impact web performance with their decisions — often without knowing it.
  • Stakeholders believe web performance is a technical problem that we can solve in the last sprint.
  • An external design agency delivered the final design without web performance in mind.
  • Marketing teams worldwide have access to a tag manager and can launch third-party scripts and code anytime they want without consulting developers.
  • The CMS platform gives editors maximum flexibility in media formats, content order, and page length.
  • Depending on the consent a user grants, the third parties loaded can differ a lot.
  • Long release cycles and a complex build system make it hard to test hypotheses quickly.

My goal is to improve the user experience and the business outcome day by day. I am still looking for a tool that tells me what to focus on today to make the most significant impact. As this is the perfect time of the year to create a wish list, here is mine:

Predict web performance as early as possible

Many web performance problems are already baked in during the design phase of a project. Our clients could save a lot of time and money if there were tools for designers that predicted web performance issues based on a page design:

  • Media elements used above the fold
  • Detecting features like sliders or media galleries
  • Size of the largest element (LCP)
  • Number of used fonts and font styles
  • Length of the page

It would be great to have plugins for the standard design tools that can predict web performance. Most likely, the predictions won’t be very accurate, but they would remind designers to think about progressive enhancement and to avoid common web performance pitfalls.
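
To make this concrete, here is a minimal sketch of the kind of heuristics such a plugin could run against a design file. The `DesignFrame` structure and every threshold below are my own assumptions for illustration, not a real Figma or Sketch plugin API:

```typescript
// Hypothetical, simplified model of a design frame. A real plugin would
// read this information from the design tool's plugin API instead.
interface DesignElement {
  type: "image" | "video" | "slider" | "text";
  y: number;      // vertical position in px
  width: number;
  height: number;
}

interface DesignFrame {
  viewportHeight: number;   // height of the design viewport in px
  pageHeight: number;       // total page length in px
  fontStyles: Set<string>;  // e.g. "Inter-400", "Inter-700"
  elements: DesignElement[];
}

interface PerformanceWarning {
  rule: string;
  message: string;
}

// All thresholds are assumed values for illustration; they would need
// to be tuned against real field data.
function predictPerformanceIssues(frame: DesignFrame): PerformanceWarning[] {
  const warnings: PerformanceWarning[] = [];
  const aboveFold = frame.elements.filter((e) => e.y < frame.viewportHeight);

  const media = aboveFold.filter((e) => e.type === "image" || e.type === "video");
  if (media.length > 2) {
    warnings.push({
      rule: "above-fold-media",
      message: `${media.length} media elements above the fold may delay LCP.`,
    });
  }

  if (aboveFold.some((e) => e.type === "slider")) {
    warnings.push({
      rule: "slider",
      message: "Sliders often ship heavy JavaScript; consider a static hero.",
    });
  }

  const largest = [...frame.elements].sort(
    (a, b) => b.width * b.height - a.width * a.height
  )[0];
  if (largest && largest.type !== "text") {
    warnings.push({
      rule: "lcp-candidate",
      message: "The largest element is media and will likely be the LCP element.",
    });
  }

  if (frame.fontStyles.size > 4) {
    warnings.push({
      rule: "fonts",
      message: `${frame.fontStyles.size} font styles mean several webfont downloads.`,
    });
  }

  if (frame.pageHeight > 6 * frame.viewportHeight) {
    warnings.push({
      rule: "page-length",
      message: "Very long page; plan lazy loading for content below the fold.",
    });
  }

  return warnings;
}
```

Even rough warnings like these would be cheap to act on while the design is still editable, compared to a failed performance audit after development.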

Don’t let me do the hard work

It is hard to decide what to focus on today to have a significant impact on user experience and business outcomes. It would be great to have a dashboard providing the following data (a sketch of a possible data model follows the list):

  • Which pages have problems?
  • How many users are affected?
  • What is the predicted business impact?
  • What is the root cause? Content? Code? Network? Third-party?
  • Which users are affected (device types, viewports, network types, consent status, browser versions, regions, browser settings like save-data, first vs. repeated visits)?
  • Who should be fixing it?
  • An easy way to run synthetic tests for affected pages to gather more data.
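
As a sketch of what such a dashboard could be built on, here is one possible shape for the underlying data. Every type and field name is an assumption about how the points above might be modeled, not any real vendor's schema:

```typescript
// Hypothetical issue record a dashboard could aggregate and rank.
type RootCause = "content" | "code" | "network" | "third-party";

interface AudienceSegment {
  deviceType: "mobile" | "tablet" | "desktop";
  viewport: string;        // e.g. "360x640"
  networkType: string;     // e.g. "4g"
  consentStatus: "full" | "partial" | "none";
  browserVersion: string;
  region: string;
  saveData: boolean;       // save-data browser setting
  visitType: "first" | "repeat";
}

interface PageIssue {
  url: string;                          // which page has a problem
  metric: "LCP" | "CLS" | "FID";
  affectedUsers: number;                // how many users are affected
  predictedBusinessImpact: number;      // e.g. estimated lost revenue per week
  rootCause: RootCause;                 // content, code, network, or third-party
  affectedSegments: AudienceSegment[];  // which users are affected
  owner: string;                        // who should be fixing it
  syntheticTestUrl?: string;            // one-click link to trigger a synthetic test
}

// The dashboard's main view: issues sorted by predicted business impact.
const prioritize = (issues: PageIssue[]): PageIssue[] =>
  [...issues].sort((a, b) => b.predictedBusinessImpact - a.predictedBusinessImpact);
```

Sorting by predicted business impact is what would finally answer the question of what to focus on today.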

Make data actionable

From my experience, people hate having to learn yet another tool. A web performance tool would be much more effective if it focused on finding issues and their root causes, and then presented that data inside the tools people already use to fix the problems:

  • If web performance drops after content is published, mark the affected page inside the CMS and alert the editor in charge.
  • If web performance drops after a deployment, inform the release manager.
  • If web performance drops after a third-party change, mark the service inside the tag manager and alert the responsible person.
  • If web performance drops for a specific region, create a ticket for the operations team.

Bringing the data to people in a form they understand helps fix problems much faster.
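
Here is a minimal sketch of that routing idea, assuming hypothetical integration hooks; nothing below is a real CMS, chat, tag-manager, or ticketing API:

```typescript
// Hypothetical event emitted by a monitoring pipeline when a regression
// is detected and correlated with a recent change.
interface RegressionEvent {
  trigger: "publish" | "deployment" | "third-party-change" | "regional";
  pageUrl?: string;
  thirdPartyId?: string;
  region?: string;
}

// Illustrative integration points. In practice, these would call the
// CMS, chat, tag-manager, and ticketing systems your teams already use.
const integrations = {
  flagPageInCms: (url: string) =>
    console.log(`CMS: flag ${url} and alert the editor in charge`),
  notifyReleaseManager: () =>
    console.log("Chat: inform the release manager about the regression"),
  flagServiceInTagManager: (id: string) =>
    console.log(`Tag manager: flag service ${id} and alert the responsible person`),
  createOpsTicket: (region: string) =>
    console.log(`Ticketing: open a ticket for the operations team (region: ${region})`),
};

// Route each regression to the tool where it will actually get fixed.
function routeRegression(event: RegressionEvent): void {
  switch (event.trigger) {
    case "publish":
      if (event.pageUrl) integrations.flagPageInCms(event.pageUrl);
      break;
    case "deployment":
      integrations.notifyReleaseManager();
      break;
    case "third-party-change":
      if (event.thirdPartyId) integrations.flagServiceInTagManager(event.thirdPartyId);
      break;
    case "regional":
      if (event.region) integrations.createOpsTicket(event.region);
      break;
  }
}
```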

Protect the environment

Computing power, network traffic, and data storage have a tremendous impact on our environment. According to an eye-opening talk by Gerry McGovern:

  • 90% of data is never accessed again three months after it is stored (Tech Target)
  • 80% of digital data is never accessed (Active Archive Alliance)
  • 90% of data is not analyzed (Lucidworks)
  • 90% of data is never analyzed (IDC)

How can we learn everything we need to improve the user experience with the least amount of data collected, processed, and stored?

I like the Core Web Vitals. But why bother calculating the 75th percentile? Focusing on the pages with a poor user experience would be much more effective, and we would only have to collect and store a fraction of the data.
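
As a rough sketch of the difference: a 75th percentile needs (at least a summary of) every measurement, while "does this page give a poor experience?" can be answered per sample against a fixed threshold and reduced to a counter at collection time. The LCP threshold below is the published Core Web Vitals boundary; the sample values are made up:

```typescript
// Published Core Web Vitals boundary for LCP: "poor" is above 4000 ms.
const LCP_POOR_MS = 4000;

// Percentile approach: requires keeping (a summary of) all samples.
function percentile(samples: number[], p: number): number {
  const sorted = [...samples].sort((a, b) => a - b);
  const index = Math.max(0, Math.ceil((p / 100) * sorted.length) - 1);
  return sorted[index];
}

// Threshold approach: each sample collapses to a boolean at collection
// time, so only a counter per page needs to be stored.
function poorShare(samples: number[]): number {
  return samples.filter((ms) => ms > LCP_POOR_MS).length / samples.length;
}

// Made-up LCP samples (in ms) for one page:
const lcpSamples = [1800, 2100, 2600, 3200, 4500, 5100];
console.log(percentile(lcpSamples, 75)); // 4500 -> "poor" at p75
console.log(poorShare(lcpSamples));      // 0.33... -> a third of visits are poor
```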

I don’t see a good reason to keep data that is more than a month old. It would be great to offer settings to adjust what data is collected and how long it is stored.

I am curious what your wishes for next-generation web performance tools are. Please share them in the comments.