At the FT we unfortunately have to support old browsers. People who work in large banks are an important part of our audience, and their companies are unbelievably slow to upgrade their workstations. In fact, our colleagues in Beijing who publish FT Chinese also have to deal with a stubbornly significant legion of IE6 users. Yes, IE6.
> MSIE traffic to FT Chinese, by version. A reminder from our Beijing colleagues of the need for good polyfills. pic.twitter.com/gNKc2JqiGQ
>
> — FT Labs (@FTLabs) October 8, 2014
How significant? I looked into how much ad revenue we make just from IE < 9, and, well, it’s a lot.
But the majority of users, both FT readers and internet users generally, are on one of the latest two versions of IE, Chrome or Firefox, all of which are easy to support. What's more, we are all now almost obsessively focused on performance. We don't want to slow down the experience for the majority who are using up-to-date browsers, and we want to spend more of our time making those browsers even faster. Adding compatibility code for legacy browsers both slows down the browsers that don't need it and wastes time we could be spending on precious millisecond-shaving performance work.
To address this concern, along with Jonathan Neal (co-author of HTML5Shiv), some lovely people at Yahoo, and with support from Fastly, we created the polyfill service.
Interesting. How does it work?
1. You add a script tag to your page before your own scripts (see the snippet after this list)
2. It makes a request to our (Fastly-hosted) CDN application, which inspects your user's `User-Agent` header and packages a set of polyfills suitable for bringing that browser out of the stone age
3. The bundle is minified, gzipped, and served to your user over SSL with best-practice cache-control policies
4. Profit
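To make step 1 concrete, here's roughly what the tag looks like. This is a sketch: the URL follows the public CDN endpoint's pattern at the time of writing, and `/js/app.js` is a made-up path standing in for your own bundle.

```html
<!-- Polyfill bundle first, so missing features are filled in before your code runs.
     The host and version path below assume the public cdn.polyfill.io instance. -->
<script src="//cdn.polyfill.io/v1/polyfill.min.js"></script>

<!-- Your own application code, which can now assume the polyfilled features exist -->
<script src="/js/app.js"></script>
```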
And there’s no dubious gap before step 4 either, because you literally stop worrying about why your application doesn’t work in <insert name of irritatingly popular legacy browser>, and instead spend your time making your app more excellent. Amongst other things, the polyfill service will automatically give you styleable HTML5 elements, querySelector, classList, and most ES5 methods in browsers as far back as IE6. All without delivering any pointless code to the newest browsers.
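To illustrate what that buys you, the snippet below uses `querySelector`, `classList` and ES5 array methods directly; natively these are missing in IE8 and earlier, but with the polyfill bundle loaded the same code runs unchanged. It's only a sketch, and the `#signup-button` element is invented for the example.

```js
// No feature detection or branching: these APIs are either native
// (modern browsers) or supplied by the polyfill bundle (legacy browsers).
var button = document.querySelector('#signup-button'); // hypothetical element
button.classList.add('highlighted');

var prices = ['12.50', '8.99', '3.20'];
var total = prices
  .map(function (p) { return parseFloat(p); })
  .filter(function (p) { return p > 5; })
  .reduce(function (sum, p) { return sum + p; }, 0);

button.innerHTML = 'Total over 5: ' + total.toFixed(2);
```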
Here’s a size comparison between the script bundles that we serve (based on the default feature list) for the last six versions of Internet Explorer:
Browser | Raw size | Minified | Gzipped |
---|---|---|---|
IE6 | 41.3KB | 20.7KB | 6.8KB |
IE7 | 41.3KB | 20.7KB | 6.8KB |
IE8 | 28.9KB | 14.5KB | 4.6KB |
IE9 | 8.5KB | 4.0KB | 1.7KB |
IE10 | 3.0KB | 1.3KB | 0.6KB |
IE11 | 2.5KB | 1.1KB | 0.5KB |
Some of the larger polyfills can increase the size dramatically, but again we only serve them to browsers that need them, and we don’t include the largest ones by default, so you can opt in to them only when your application needs those features.
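When you do want one of those heavier polyfills, you opt in via the bundle URL's `features` parameter rather than changing any code. A sketch only; treat the exact syntax and feature names as illustrative, and check the service's documentation for what's actually available:

```html
<!-- Default feature set plus an explicitly requested large polyfill.
     "Intl" is shown only as an example of an opt-in feature name. -->
<script src="//cdn.polyfill.io/v1/polyfill.min.js?features=default,Intl"></script>
```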
But… an extra request!
Yes, there is that. But if you’re especially concerned about it, you can run the polyfill service yourself, either as a library in your Node.js code or as an HTTP API replicating the public instance, and we would love to see backend implementations in other languages. Even if we don’t have an implementation for your language yet, just running the service on the same domain as your application (behind a reverse proxy, for example) negates the performance impact of the extra request if your servers support HTTP/2.
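As a rough sketch of the library route, the snippet below wires the `polyfill-service` npm module into a plain Express route. The `getPolyfillString` call is based on the module's documented API, but check the README of the version you install for the exact method name, options and return type; everything else here (route path, cache lifetimes) is just an example.

```js
// Sketch: serve UA-tailored polyfill bundles from your own domain.
var express = require('express');
var polyfillio = require('polyfill-service'); // verify the API against the installed version

var app = express();

app.get('/polyfill.min.js', function (req, res, next) {
  // Promise.resolve() tolerates the method returning either a string or a promise.
  Promise.resolve(polyfillio.getPolyfillString({
    uaString: req.headers['user-agent'],
    minify: true
  })).then(function (bundle) {
    res.set('Content-Type', 'application/javascript');
    res.set('Vary', 'User-Agent'); // bundles differ per browser, so caches must key on UA
    res.set('Cache-Control', 'public, max-age=86400');
    res.send(bundle);
  }).catch(next);
});

app.listen(3000);
```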
In any case, provided your own JavaScript is not blocking the page render, there’s no reason the polyfill request has to either. And we should be much more concerned with the critical path to first render than with the complete page load time.
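One common pattern for keeping both requests off the critical path is to inject the polyfill script asynchronously and only start your own code once it has loaded. This is a generic sketch rather than anything specific to the service, and the `/js/app.js` path is made up:

```js
// Load a script without blocking render, then run a callback when it's ready.
function loadScript(src, done) {
  var script = document.createElement('script');
  script.src = src;
  script.async = true;
  script.onload = done; // note: very old IE needs onreadystatechange as well
  var first = document.getElementsByTagName('script')[0];
  first.parentNode.insertBefore(script, first);
}

// Polyfills first, then the application code that depends on them.
loadScript('//cdn.polyfill.io/v1/polyfill.min.js', function () {
  loadScript('/js/app.js');
});
```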
But… user agent sniffing!
Server-side device detection gets an extremely bad rap, unfairly so. It’s easy to do bad things with it, people did those bad things, and that’s why a typical user agent string is now so ambiguous. But applied correctly, and for the right reasons, there’s nothing wrong with the practice, and pretty much every top-50 website in the world does it.
The bad approach is to do some naive pattern matching, like `/Windows 9/` to match Windows 95 and 98, and then never update it, so it misfires when Windows 9 is due for release 20 years later (reportedly the reason Microsoft skipped straight to Windows 10). It’s even worse if that pattern matching is exclusive, i.e. you only allow your most advanced features to be used if the pattern matches: patterns like `/chrome|iphone/` cause vendors like Mozilla and Microsoft to start including those words in their own User-Agent strings, and chaos ensues.
We use the npm `useragent` module, and don’t write any of our own patterns. We also use UA sniffing only to enhance browsers, never to exclude them. This is completely in line with what the HTTP specification envisages the header being used for:
> The User-Agent request-header field contains information about the user agent originating the request. This is for … automated recognition of user agents for the sake of tailoring responses to avoid particular user agent limitations.
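For the curious, parsing a User-Agent header with the `useragent` module looks roughly like this; the version threshold and feature names are invented for illustration, not the service's actual configuration:

```js
// Parse the UA string with a maintained library instead of hand-rolled regexes,
// and use the result only to *add* capability for older browsers.
var useragent = require('useragent');

function extraPolyfillsFor(uaHeader) {
  var agent = useragent.parse(uaHeader); // e.g. family 'IE', major '8'
  var wanted = [];

  if (agent.family === 'IE' && Number(agent.major) < 9) {
    // Enhance, never exclude: old IE simply gets more shipped to it.
    wanted.push('ES5 methods', 'querySelector', 'classList');
  }
  return wanted;
}
```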
OK, I’m sold. How can I help?
Try out the service on your own site and raise any issues you have on our GitHub repo. You could even consider fixing them and making a pull request, or adding the polyfills you need for your sites. We have detailed contribution guidelines, and some amazing sponsored rewards from supporting organisations for fixes focusing on particular areas of the web platform, starting with premium access to the FT.