If you’re reading this, you’re probably the type who’s continuously looking for pragmatic, forward-thinking ways to make your site(s) faster. So when I read a guide by Phil Walton a while back about a technique called differential serving, I was intrigued. If you haven’t heard of this technique, it’s the idea that you can compile and serve two separate JavaScript bundles for your site:
- One bundle with all the Babel-fied transforms and polyfills that work on all browsers – which only get served to the legacy browsers that actually need them. This is the bundle you’re probably already generating.
- A second bundle with the same functionality as the first, but with few to no transforms or polyfills. This bundle is served only to modern browsers that don’t need them.
We use Babel to transform scripts so we can use modern syntax everywhere, but we do so at some peril. The extra code it adds – in most configurations – is often unnecessary for users on modern browsers. With a bit of effort, build processes can be changed to reduce the amount of code we send to the large chunk of users on modern browsers, while maintaining compatibility for those lingering on legacy clients such as IE 11. The aim of differential serving isn’t just to improve transfer time – though it certainly can help in that department. It can also reduce blocking of the main thread, because there’s simply less script for the browser to parse, compile, and execute, all of which is resource-intensive work.
In this guide, you’ll learn how you can set up differential serving in your build pipeline in 2019, from setting up Babel, to what tweaks you’ll need to make in webpack, as well as the benefits of doing all this work.
Setting up your Babel configurations
Outputting multiple builds of the same app involves a Babel configuration for each target. Multiple Babel configs in a single project aren’t uncommon; it’s usually done by placing each separate config object under an `env` key. Here’s what that looks like in a Babel config set up for differential serving:
```js
// babel.config.js
module.exports = {
  env: {
    // This is the config we'll use to generate bundles for legacy browsers.
    legacy: {
      presets: [
        [
          "@babel/preset-env",
          {
            modules: false,
            useBuiltIns: "entry",
            // This should reasonably target older browsers.
            targets: "> 0.25%, last 2 versions, Firefox ESR"
          }
        ]
      ],
      plugins: [
        "@babel/plugin-transform-runtime",
        "@babel/plugin-syntax-dynamic-import"
      ]
    },
    // This is the config we'll use to generate bundles for modern browsers.
    modern: {
      presets: [
        [
          "@babel/preset-env",
          {
            modules: false,
            targets: {
              // This will target browsers which support ES modules.
              esmodules: true
            }
          }
        ]
      ],
      plugins: [
        "@babel/plugin-transform-runtime",
        "@babel/plugin-syntax-dynamic-import"
      ]
    }
  }
};
```
You’ll note that there are two configurations: `modern` and `legacy`. These control how each bundle gets transformed by Babel. Ironically, the tool that polyfills and adds unnecessary transforms to our code is the same tool we can use to ship less code. In Phil’s original article, he used Babel 6’s `babel-preset-env` to achieve this. Now that Babel 7 is ready, we use `@babel/preset-env` instead.
The first thing to note is that the options used in `@babel/preset-env` are different in each config. For `legacy`, we pass a browserslist query to the `targets` option appropriate for legacy browsers. We also tell the preset to include polyfills from `@babel/polyfill` with the `useBuiltIns` option. Aside from presets, we include a couple of necessary plugins.
Note: `useBuiltIns` accepts two values aside from `false`. These values are `"entry"` and `"usage"`. The documentation does a good job of explaining how they differ, but it’s worth noting that `"usage"` is experimental. It will often yield smaller bundles compared to `"entry"`, but I’ve found I’ve needed to specify `"entry"` in order for scripts to work in IE 11.
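To make the `"entry"` behavior concrete, here’s a minimal sketch of an entry module, assuming `src/index.js` is your entry point (as it is in the webpack configs coming up); the `./app` import and `init` function are just placeholders:

```js
// src/index.js

// With useBuiltIns: "entry", @babel/preset-env replaces this single import
// with only the individual polyfills needed for the browsers matched by
// the `targets` query in the legacy config.
import "@babel/polyfill";

// The rest of the application; `./app` and `init` are hypothetical.
import { init } from "./app";

init();
```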
Things look largely the same for the `modern` config, except the value of `targets` is different. Rather than pass a browserslist query, we pass an object using the `esmodules` option. When it’s set to `true`, `@babel/preset-env` applies fewer transforms, because the preset targets browsers that natively support ES modules, `async`/`await`, and other modern features. The `useBuiltIns` option is also dropped altogether, since none of the features used in the project need to be polyfilled. That said, your application may need some polyfills if you’re using bleeding-edge features that aren’t well supported even across modern browsers. If your application breaks with this setup, set `useBuiltIns` appropriately.
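If you land in that situation, one option is to flip the modern env over to usage-based polyfilling so only the polyfills your code actually uses get pulled in. This is a sketch of that variant, not the config used above, and given the experimental nature of `"usage"`, test it carefully:

```js
// babel.config.js – a variant of the modern env, not the config shown above.
module.exports = {
  env: {
    modern: {
      presets: [
        [
          "@babel/preset-env",
          {
            modules: false,
            // "usage" injects only the polyfills your code actually calls,
            // scoped to browsers that support ES modules.
            useBuiltIns: "usage",
            targets: {
              esmodules: true
            }
          }
        ]
      ],
      plugins: [
        "@babel/plugin-transform-runtime",
        "@babel/plugin-syntax-dynamic-import"
      ]
    }
  }
};
```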
Configuring webpack for differential serving
webpack – and most other bundlers – offers a feature called multi-compiler mode. This feature is essential for differential serving. Multi-compiler mode lets you pass an array of configuration objects to spit out multiple sets of bundles:
```js
// webpack.config.js
module.exports = [{
  // Object config one
}, {
  // Object config two
}];
```
This is essential because we can pass two separate configurations which both use the same entry point. We can also adjust the rules in each configuration as appropriate.
This is easier said than done, though. webpack is terrifically complicated at times, and never more so than when you’re dealing with multiple configurations. It’s not impossible, however, so let’s find out what it takes to get there.
Start with a common configuration
Because you’re doing separate builds of the same entry point, your configurations will have a lot in common. A common config is a convenient way to manage those similarities:
```js
// webpack.config.js
const path = require("path");

// `devMode` toggles between development and production output.
const devMode = process.env.NODE_ENV !== "production";

const commonConfig = {
  mode: devMode ? "development" : "production",
  entry: path.resolve(__dirname, "src", "index.js"),
  plugins: [
    // Plugins common amongst both configurations
  ]
};
```
From here, you can write your separate webpack configs and use spread syntax to merge the common configuration into each:
```js
// webpack.config.js
const legacyConfig = {
  name: "client-legacy",
  output: {
    // Example output path and filename pattern; adjust to your project.
    path: path.resolve(__dirname, "dist"),
    filename: "[name].[chunkhash:8].js"
  },
  module: {
    rules: [
      // Loaders go here...
    ]
  },
  // Spread syntax merges commonConfig into this object.
  ...commonConfig
};

const modernConfig = {
  name: "client-modern",
  output: {
    path: path.resolve(__dirname, "dist"),
    // Note the use of the .mjs extension for the modern config.
    filename: "[name].[chunkhash:8].mjs"
  },
  module: {
    rules: [
      // Loaders go here...
    ]
  },
  // Ditto.
  ...commonConfig
};

// Slap 'em in there
module.exports = [legacyConfig, modernConfig];
```
The takeaway here is that you can minimize the amount of configuration you have to write by factoring what’s common between both configurations into a shared object. From there, you only need to focus on the key differences between each target.
Managing both configurations
Now that you know how to manage what’s common between each configuration, you need to know how to manage what’s different. Managing loaders and plugins gets tricky when you’re compiling a common entry point to different targets. This is especially true if you’re handling more than just JavaScript assets. Here’s a bit of guidance I hope will help.
babel-loader
Arguably, the most common loader you’ll see in any webpack config is `babel-loader`. For what we’re trying to achieve, you’ll need to use `babel-loader` in both your modern and legacy config objects, albeit with slightly different configurations. The `babel-loader` config will look like this for the legacy browser target:
```js
// webpack.config.js
const legacyConfig = {
  // ...
  module: {
    rules: [
      {
        test: /\.js$/i,
        // Make sure you're bundling your vendor scripts to a separate chunk,
        // otherwise this exclude pattern may break your build on the client!
        exclude: /node_modules/i,
        use: {
          loader: "babel-loader",
          options: {
            envName: "legacy" // Points to env.legacy in babel.config.js
          }
        }
      },
      // Other loaders...
    ]
  },
  // ...
};
```
In the modern browser target, the only differences are that we change the `test` regex to `/\.m?js$/i` so it also matches the ES module file extension (`.mjs`) used by some npm packages, and that we change the value of `options.envName` to `"modern"`. `options.envName` points to the separate configurations contained in the `babel.config.js` example from earlier.
```js
// webpack.config.js
const modernConfig = {
  // ...
  module: {
    rules: [
      {
        test: /\.m?js$/i,
        exclude: /node_modules/i,
        use: {
          loader: "babel-loader",
          options: {
            envName: "modern" // Points to env.modern in babel.config.js
          }
        }
      },
      // Other loaders...
    ]
  },
  // ...
};
```
Other loaders and plugins
Depending on your project, you may have other loaders or plugins that handle asset types other than JavaScript. How you handle them for each browser target depends on your project’s needs, but here’s a bit of advice.
- You may not need to change anything with your other loaders. The thing to remember is that webpack doesn’t just manage JavaScript, it can also manage a whole bunch of other stuff. CSS, images, fonts, basically whatever you’ve installed a loader for. Because of this, it’s important that you output the same – or at least maintain the same references to – assets for each browser target.
- Some loaders allow you to disable file emission, which can be useful in differential serving builds. For example, say you use `file-loader` to handle importing non-JavaScript assets. In the config for modern browsers, you can tell `file-loader` to output files, whereas in the config for legacy browsers, you can specify `emitFile: false` to prevent the same files from being written to disk twice. This may help speed builds up a bit (there’s a sketch of this after the list). `null-loader` could also be potentially useful for controlling the loading and emission of files where multiple configurations are used.
- Be careful with hash-versioned assets. Let’s say you use an image optimization loader (e.g., `image-webpack-loader`) to optimize images. You may need to use that loader in both configurations; otherwise, one asset graph will contain references to unoptimized images while the other contains references to optimized ones. Because the file contents differ for each build, the file hashes will differ as well, and the result is that one set of users gets unoptimized image assets while the rest get optimized ones.
- Plugins are a whole other ball of wax, and the best guidance for using them in a differential serving setup varies. For example, if you’re using `copy-webpack-plugin` to copy miscellaneous stuff from `src` to `dist`, chances are good you’ll only need it in one configuration, not both. That said, having the same plugin in both configurations shouldn’t cause problems, though build speed could be affected.
- If your loader and plugin configuration starts to get a bit hairy, npm scripts can be a nice substitute. For simple projects, I’ll often install image optimization binaries locally with npm (e.g., `pngquant-bin`) and use `npx` in an npm script to do this work after a build has finished. This reduces clutter in my webpack config, which is always a welcome change.
- If you’re using `assets-webpack-plugin` to generate an asset manifest for both builds, things can get complicated. You’ll need to create a single instance of the plugin and pass it into each configuration’s `plugins` array. In a project of mine, I use `assets-webpack-plugin` in a Node script to inject references to scripts into generated HTML (more on that later).
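To make the `emitFile` idea from the list above concrete, here’s a minimal sketch of a rule you might add to the legacy config’s `module.rules`. The test regex and the `name` pattern are just examples; the important part is keeping filenames identical to the modern build while skipping the second write to disk:

```js
// webpack.config.js – a rule for legacyConfig.module.rules (sketch only).
const legacyFileRule = {
  test: /\.(png|jpe?g|gif|woff2?)$/i,
  use: {
    loader: "file-loader",
    options: {
      // Use the same naming pattern as the modern config so both asset
      // graphs reference identical URLs...
      name: "[name].[hash:8].[ext]",
      // ...but don't write the files to disk a second time.
      emitFile: false
    }
  }
};
```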
The gist of these points is that you should maintain parallel asset references between builds, and generally avoid writing the same assets to disk multiple times. But also do what’s reasonable and convenient in what’s likely already a complex build environment.
Managing your uglifier
Until recently, uglify-js was webpack’s default uglifier. That changed in version 4.26.0, when terser became the default. If you’re using version 4.26.0 or later, good news – you’re all set, and you don’t need to do anything else for your build to work!
If you’re on an earlier version, though, terser is not the default uglifier; uglify-js is, and you’ll need to use terser in your modern config. This is because uglify-js can’t understand JavaScript syntax beyond ES5. It chokes on stuff like arrow functions, `async`/`await`, and so on.
For your legacy config, you don’t need to do anything, as it should already build fine. For your modern config, though, you’ll need to `npm install terser-webpack-plugin` for your project. Then you’ll need to add `terser-webpack-plugin` to the `optimization.minimizer` array:
```js
// webpack.config.js
const TerserWebpackPlugin = require("terser-webpack-plugin");

const modernConfig = {
  // ...
  optimization: {
    minimizer: [
      new TerserWebpackPlugin({
        test: /\.m?js$/i, // If you're outputting .mjs files, you need this!
        terserOptions: {
          ecma: 6 // This can be set to 7 or 8, too.
        }
      })
    ]
  }
  // ...
};
```
In our modern config, we output files with an `.mjs` extension. In order for terser to recognize and minify those files, we need to modify the `test` regex accordingly. We also need to set the `ecma` option to `6` (although `7` or `8` are also valid values).
Injecting script references into HTML
You may use `html-webpack-plugin` to handle generating HTML files for your app shell markup, and for good reason. It’s a slick plugin that handles a lot of the busywork of injecting `<link>` and `<script>` tags into your HTML templates. Unfortunately, it can’t inject `<script>` tags in a way that supports differential serving. It’s up to you to figure out how to get those script references into your HTML file(s).
Fortunately, all it takes is a teeny bit of ingenuity to get around this. For a project I use differential serving on, I use `assets-webpack-plugin` to gather what assets have been generated by webpack, like so:
```js
// webpack.config.js
const AssetsWebpackPlugin = require("assets-webpack-plugin");

const assetsWebpackPluginInstance = new AssetsWebpackPlugin({
  filename: "assets.json",
  update: true,
  fileTypes: [
    "js",
    "mjs"
  ]
});
```
From here, I add this instance of `assets-webpack-plugin` to the `plugins` array in both my legacy and modern configs. I’ve configured the plugin instance with the following options to best fit my project:
- `filename` dictates where the assets JSON file should be output.
- Setting `update` to `true` tells the plugin to reuse the same assets JSON file for both the modern and legacy configs.
- Updating `fileTypes` ensures that the `.mjs` files generated by the modern fork of my config make it into `assets.json`.
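A minimal sketch of how that shared instance might be wired into the `legacyConfig` and `modernConfig` objects from earlier (exactly where you append it is up to you):

```js
// webpack.config.js – share one assets-webpack-plugin instance across configs.
[legacyConfig, modernConfig].forEach((config) => {
  // Append the shared instance alongside whatever plugins are already there.
  config.plugins = [...(config.plugins || []), assetsWebpackPluginInstance];
});

// Because the instance is shared and `update` is true, the second compile
// adds its entries to assets.json instead of overwriting the first one's.
module.exports = [legacyConfig, modernConfig];
```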
From here is where it gets a touch hacky. In order to get the `<script>` references into my HTML files using the desired pattern, I use an npm script that runs after webpack is done. This script reads the `assets.json` file generated by `assets-webpack-plugin` and plops the proper markup into place.
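My script isn’t anything fancy, and yours will differ, but a rough sketch of the idea looks something like this. The file locations, the `main` entry name, and the `<!-- scripts -->` placeholder are all assumptions; the module/nomodule markup is the pattern from Phil’s original guide:

```js
// scripts/inject-scripts.js – a rough sketch, not the exact script I use.
const fs = require("fs");
const path = require("path");

// Read the manifest written by assets-webpack-plugin. Adjust the path to
// wherever assets.json ends up in your setup.
const assets = JSON.parse(
  fs.readFileSync(path.resolve(__dirname, "..", "assets.json"), "utf8")
);

// "main" is webpack's default entry point name; yours may differ.
const { js, mjs } = assets.main;

// Modern browsers fetch the type="module" bundle and ignore nomodule;
// legacy browsers do the opposite.
const scripts = `<script type="module" src="${mjs}"></script>
<script nomodule defer src="${js}"></script>`;

// Assumes the generated HTML contains a <!-- scripts --> placeholder.
const htmlFile = path.resolve(__dirname, "..", "dist", "index.html");
const html = fs.readFileSync(htmlFile, "utf8");

fs.writeFileSync(htmlFile, html.replace("<!-- scripts -->", scripts));
```

I run it from an npm script along the lines of `"build": "webpack && node scripts/inject-scripts.js"`, so it always happens right after the bundles are written.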
Hopefully, `html-webpack-plugin` comes to support this natively, because I would personally prefer not to use this approach. That said, it gets me there. You may have to devise a solution of your own in the interim.
Is it even worth it?
You’ve read well past the halfway point of this article, and I’m sure the question remains: Is this technique worth it? My answer is a resounding yes. I’m using differential serving on my site, and I think the benefits speak for themselves:
|  | All JS assets | gzip (level 9) | Brotli (level 11) |
|---|---|---|---|
| Legacy | 112.14 KB | 38.6 KB | 33.58 KB |
| Modern | 34.23 KB | 12.94 KB | 12.12 KB |
In my situation, I’m seeing nearly a 70% reduction in JavaScript. To be fair, as bundles scale up in size, I’ve noticed the savings offered by differential serving become noticeably smaller. In my professional work, where I routinely encounter bundle sizes greater than 300 KB, I’ve seen something closer to 10%, but that’s still a significant reduction! Much of what worked in my favor here was that my particular project needed a fairly large amount of polyfills for the legacy bundle, while I was able to get away with little to no polyfills or transforms for the modern bundle.
It might also be tempting to look at the compression stats and say it doesn’t matter, but you must always keep in mind that compression only lowers the transfer time for a given asset. Compression has no effect on parse/compile/execute time. If a 100 KB JavaScript file compresses to 30 KB with Brotli, yes, users will receive it sooner, but that file is still 100 KB worth of JavaScript to evaluate.
On devices with less processing power and memory, this is a crucial distinction. I often test on a Nokia 2 Android phone, and the effects of differential serving on loading performance are pronounced. Below is a performance trace in Chrome of this device accessing my personal site before I implemented differential serving:
A performance trace of a site in Chrome’s DevTools showing a high amount of scripting activity before differential serving was implemented.
Now here’s how it performs on the same device after differential serving is in place:
A performance trace of a site in Chrome’s DevTools showing a greatly reduced amount of scripting activity after differential serving was implemented.
A ~66% decrease in scripting activity is solid. When sites get interactive more quickly, they’re more usable and enjoyable for everyone.
Conclusion
Differential serving is good stuff. If this trend from HTTPArchive is any indicator, it’s a decent bet that most sites in production are still shipping a lot of polyfilled and transformed legacy JS that’s simply not needed by a lot of users. If we need to support users on legacy browsers, we ought to seriously consider this two-pronged approach to serving JavaScript.
If nothing else, this should serve as a look into the future of JavaScript bundle size, and how maintaining a forward-looking attitude toward JavaScript tooling plays a part in reducing the code we send to users. Depending on your audience, you may not even need to serve two different bundles, but the configurations shown may give you an idea of how to ship less code than you currently are.
It’s also worth noting that the state of JavaScript tooling changes quite often. It feels very much like a “move fast and break things” sort of space at times. For example, an alpha of webpack 5 is already out there, and there are a lot of changes to dig through. It’s not unreasonable to assume that some things may break. Anecdotally, I still see projects on webpack 3 in my line of work, and it may take more time for some projects to upgrade than others. This technique – as documented – could still prove useful in the future.
If you’re interested in seeing a site of mine that uses this technique right now, have a look at this repo. Hopefully, you’ll get as much out of this as I have in my projects.