Probably many – if not most – of us speed freaks work for big companies that are optimizing top-1,000 websites. Some companies have the budget to hire people who work on WPO full time. Those companies can do that thanks to huge revenues – otherwise it would not be financially justifiable. In other words: they’re probably a top-1,000 website. The page speed improvements those people deliver must translate into enough additional revenue to cover their wages. Other companies have people who get to spend part of their time on WPO, and maybe pay for the services of one of the several WPO companies out there. But even then, sizeable revenue is a requirement to justify it.
So what about the millions of other websites?
Think tiny budgets. Shared hosting. No mod_pagespeed to be seen. The ‘S’ in ‘SME’. The small non-profits. How can they make their websites faster?
Well, a few companies are so big that they can even hire people to focus on the big leaps forward in web technology that will make the entire web faster (think better browsers and HTTP 2.0/SPDY). Usually that’s the case because the long-term success of those companies depends on the web, so they have a vested interest in an even more omnipresent web. It’s great. But it’s not enough: there’s only so much that better technology can do. A poorly implemented website with lots of large resources per page will still be slow.
So how can we make the majority of the web faster?
I believe we can do that by making the most popular CMSes faster by default.
The fat head is trampling the long tail
Increasingly, the big few own the attention of us internet users. The long tail receives fewer visitors.
Centralization. Silos. The big few are extremely successful because they have a better UX (amongst other reasons). Perceived simplicity & perceived speed are essential elements of that better UX.
If we want to protect the web’s diversity, the long tail must receive more attention again. Browsing the wider web should feel fast & fun rather than slow & annoying, to avoid a future where people spend (almost) all their time in the few silos they know.
Not just for the ideological idea of a free web with diverse opinions, but also to prevent monopolies from owning the web. Monopolies are bad for all of us for many reasons; two of them are reduced innovation and a higher risk of police state-like measures (such as censorship1 and eavesdropping).
Fast CMSes suited for various use cases – from simple blogging to complex data modeling – are essential for the success of the long tail.
Making WordPress, Drupal and Joomla fast out of the box
They all have two things in common:
- they’re free and open-source software
- they “enable non-developers to do developer-like things” (let’s call them “site builders”2)
If we can make these three open source systems faster, then we make the entire web faster!3
All three of them run fine on shared hosting. They can all be much faster when running on an optimized stack (think Varnish, memcached and so on). But the point is to make them as fast as possible on shared hosting.
In my own job, I spend a significant part of my time on making Drupal 8 (the next version of Drupal) more performant out of the box. A few of those improvements are:
- Support for responsive images.
- Partial page caching, which allows the HTML of personalized pages to be served much faster to authenticated users.
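The idea behind partial page caching can be sketched in a few lines. This is an illustration of the concept only, not Drupal’s actual render cache: shared fragments (header, footer, menus) are rendered once and reused across all users, while personalized fragments are rendered fresh on every request.

```python
# Minimal sketch of partial (fragment) page caching.
# All names here are illustrative, not Drupal 8's real API.
fragment_cache = {}  # shared fragments, reusable across users


def render_block(key, render, per_user=False):
    """Return a block of HTML, caching it unless it is personalized."""
    if per_user:
        return render()  # e.g. "Hello, Alice" – always rendered fresh
    if key not in fragment_cache:
        fragment_cache[key] = render()  # rendered once, served to everyone
    return fragment_cache[key]


def render_page(user):
    # After the first request, only the personalized block still costs work.
    return "".join([
        render_block("header", lambda: "<header>Site name</header>"),
        render_block("user", lambda: "<p>Hello, " + user + "</p>", per_user=True),
        render_block("footer", lambda: "<footer>(c) 2013</footer>"),
    ])
```

The first request pays the full rendering cost; every later request – even for a different authenticated user – only re-renders the tiny personalized fragment.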
Hopefully, Drupal 8 will see more improvements still. And Drupal already has many performance-related features. I’m sure WordPress and Joomla do, too.
But what matters most is whether these performance features are enabled out of the box: many site builders will not be knowledgeable enough to be aware of these features or their impact. So we should go the extra mile – and maybe accept additional code complexity – to ensure we can enable performance features out of the box.
- A huge part of the web would be a lot faster if WordPress shipped with static file page caching, asset minification and object caching built in and enabled out of the box.
- Shipping Joomla with an AJAXy admin UI, asset aggregation, asset dependency management, automated object cache clearing and page caching would have a big impact, too – if enabled out of the box.
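To make the static file page caching idea concrete, here is a minimal sketch (in Python, though these CMSes are PHP; the function names are made up): the rendered HTML is written to disk once, and every later request for the same URL is served as a plain file – a cache hit needs no database queries or application work at all. On shared hosting, this is often the single biggest win available.

```python
import hashlib
from pathlib import Path


def cache_path(cache_dir, url):
    """Map a URL to a file name that is safe on any filesystem."""
    return Path(cache_dir) / (hashlib.sha256(url.encode()).hexdigest() + ".html")


def get_page(cache_dir, url, render):
    """Serve `url`, rendering it at most once.

    `render` stands in for the CMS's real (expensive) rendering pipeline.
    Its output is stored as a static file, which the web server could even
    serve directly via a rewrite rule, bypassing the CMS entirely.
    """
    path = cache_path(cache_dir, url)
    if path.exists():
        return path.read_text()       # cache hit: read a plain file
    html = render(url)                # cache miss: render once...
    path.parent.mkdir(parents=True, exist_ok=True)
    path.write_text(html)             # ...and keep it for later requests
    return html
```

The hard part in a real CMS is not this lookup but invalidation: knowing which cached files to delete when content changes, which is exactly why it should ship built in rather than be left to the site builder.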
Overall, CMS developers should stop blaming the site builder for setting things up suboptimally and instead blame themselves!
CMSes should be fast by default. And whenever a site builder is about to configure something that would negatively impact perceived performance, they should be made aware of it – or better yet: remove that choice from the UI altogether and make it overridable only through code (or a CMS extension).
Content strategy affects performance, too
All of the above is about performance-related features. But not every performance problem can be solved with better algorithms or smarter caching.
Content strategy is crucial for a fast site. Even a highly optimized responsive website can load excruciatingly slowly if the content it needs to show includes several multi-hundred-KiB images. Sadly, in the current state of the web, that is the rule, not the exception. CMSes should protect their users against that (for example by automatically losslessly optimizing images on output while retaining the originals) or, if unable to protect them, educate them.
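One simple instance of “losslessly optimize on output, retain the originals” is stripping metadata chunks from PNGs: the pixels are untouched, only ancillary text/timestamp chunks are dropped. The sketch below is illustrative (a real CMS would likely shell out to a tool such as optipng or jpegtran); the chunk selection and function name are my own.

```python
import struct

# Ancillary PNG chunks that can be dropped without changing pixel data
# (assumption for this sketch: we only strip textual/timestamp metadata).
STRIPPABLE = {b"tEXt", b"zTXt", b"iTXt", b"tIME"}

PNG_SIG = b"\x89PNG\r\n\x1a\n"


def strip_png_metadata(data):
    """Losslessly shrink a PNG by dropping metadata chunks.

    Returns new bytes; the input is untouched, so the CMS can keep the
    uploaded original and serve the optimized derivative.
    """
    if not data.startswith(PNG_SIG):
        raise ValueError("not a PNG file")
    out = bytearray(PNG_SIG)
    pos = len(PNG_SIG)
    while pos < len(data):
        (length,) = struct.unpack(">I", data[pos:pos + 4])
        ctype = data[pos + 4:pos + 8]
        # A chunk is: 4-byte length + 4-byte type + data + 4-byte CRC.
        chunk = data[pos:pos + 12 + length]
        if ctype not in STRIPPABLE:
            out += chunk
        pos += 12 + length
    return bytes(out)
```

Metadata stripping alone rarely saves multi-hundred-KiB – resizing and recompressing oversized images matters far more – but it shows the pattern: optimize silently on output, never destroy the original.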
If the site builder decides to add all sorts of social media widgets (think Twitter, Facebook, and Google+), and perhaps several tracking analytics scripts… then pages will inevitably take longer to load.
So another part of the puzzle is education. To my knowledge, no CMS even attempts to educate its users.
It’s great that many important websites with millions or billions of visitors are being sped up so much. But let’s not forget about the millions of other websites. As Stoyan Stefanov so elegantly & empirically demonstrated: users are frustrated by the entire web being slow, not just the web giants.
Do it to further increase the success of open source. Or to reduce the influence of monopolies. Or to make more money. Or to stand up against cultural imperialism. Or just because you selfishly want the web to be faster because you’re impatient.
Let’s make the entire web faster.
One must think in a global context, because the web is global. Hence silos are typically global too, yet they’re usually regulated in a single country. For example: Facebook is an American company. In the U.S.A.’s popular culture (e.g. movies), extreme violence is acceptable, but deity forbid a nipple is seen for a fraction of a second. This particular cultural view is then also applied by American companies: see “Facebook’s nudity and violence guidelines are laid bare”:
“Nipples are rude but crushed limbs are OK”. This is considered “cultural censorship” or “cultural imperialism”. In other countries, the values may be different. For example, in the case of violence vs. nipples, Sweden takes the opposite stance on what’s acceptable. ↩
Site builders are people who are not necessarily programmers, but can still build a rich, complex website. Zach Chandler from Stanford University has aptly described their importance:
“The reason why the site builder is so important is that often they are experts in something else, subject matter experts that can use Drupal to model things that exist in their discipline.” They can do that thanks to:
- data modeling and query builder UIs
- an ecosystem of extensions for the basic CMS
Making perceived performance part of the features they compete on would be interesting, but they each have a different focus, so in reality there’s little competition: they serve different use cases & markets. ↩