If there is one thing Billy Hoffman believes in, it's transparency. In fact, he once got sued over it, but that is another story. Billy continues to push for transparency as founder and CEO of Zoompf, whose products provide visibility into your website's performance by identifying the specific issues which are slowing your site down. You can follow Zoompf on Twitter and read Billy's performance research on Zoompf's blog Lickity Split.
We all know that 3rd party content means you no longer control all the factors which affect page load time. A sleek, well-tuned, and optimized site can still deliver a poor user experience because of problems with 3rd party content. Steve Souders even used to publish a series of blog posts where he analyzed and rated the performance of 3rd party content snippets. (Dear Steve, please bring this back, it was awesome). Mathias Bynens took this one step further, showing how to additionally optimize Google’s markup and JavaScript snippets.
The surprising lesson to learn from Steve and Mathias is that if you want a fast site and 3rd party widgets, then you need to examine the 3rd party content for performance problems, even when a snippet comes from a trusted authority on web performance. So this post isn’t really going to be about 3rd party content. It’s going to be about trusting advice.
Last week a Zoompf customer, the online precious metal exchange GoldMoney, contacted Support about an issue our technology had flagged on their site. We had detected a problem with Google’s JavaScript library for their Google+ button. Zoompf WPO was suggesting the customer do something that contradicted Google’s advice. And that was enough to give GoldMoney pause.
The specific issue that Zoompf was flagging was that Google’s plusone.js library was being referenced using SSL from a non-SSL page. SSL is important because, when used properly, it provides communications privacy and integrity. However, a CSS file, a JavaScript library, or even a favicon that is referenced using an SSL-enabled hyperlink from an HTML page which is not itself served over SSL most likely does not contain information that needs protecting. Since SSL provides these security features at the cost of a decrease in web performance (as discussed below), it is important to use SSL only when you have to.
In this case, the Google plusone.js button library does not contain personal or private information. Zoompf’s suggestion was to retrieve the Google+ library using http:// instead of https://. Here is what Google’s documentation has to say (emphasis added):
The +1 button code requires a script from Google’s servers. You may get this error by including the script via http:// on a page that’s loaded via https://. We recommend using https:// to include the script:
<script type="text/javascript" src="https://apis.google.com/js/plusone.js"></script>
If your web page uses https://, some browsers and verification tools will show an error when any assets on the page are called via http://. If your site serves pages via https://, make sure that the +1 button code on those pages also uses https://. (In fact, it’s fine to use https:// in the button code for all pages, even if they are only served via http://.)
The “error” that Google is trying to avoid is a mixed content warning. A mixed content warning happens when an HTML page served over HTTPS references resources using plain HTTP. Due to some serious design flaws in modern browsers, mixed content can allow privileged information like the DOM, cookies, referrer URLs, session IDs, and more to be accessed by untrusted parties. Browsers usually display a confusing dialog box or simply fail to render the page, depending on their security settings. Google’s solution to avoid all of this is to just always request the plusone.js file using SSL, even when SSL is not needed.
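To make the scenario concrete, here is a minimal sketch of the two markup patterns involved (the page URL is hypothetical). The first reference is what triggers a mixed content warning on a secure page; the second is Google’s always-https:// recommendation:

```html
<!-- Assume this markup lives on a page served over https://
     (say, https://www.example.com/account) -->

<!-- Referencing the script over plain http:// from that secure page
     is what triggers the browser's mixed content warning: -->
<script type="text/javascript" src="http://apis.google.com/js/plusone.js"></script>

<!-- Google's recommendation sidesteps the warning by always using https://,
     even when the page itself is served over plain http:// -->
<script type="text/javascript" src="https://apis.google.com/js/plusone.js"></script>
```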
But using SSL, just for the fun of it, is not a good idea. SSL impacts web performance negatively in several ways:
- HTTPS connections take longer to create than regular HTTP connections. Additional requests may need to be sent to different servers to validate the X.509 certificate chain before the SSL connection can begin, causing all pending HTTPS connections to that server to block.
- Establishing an HTTPS connection is computationally expensive. The browser and server must do a large amount of work during the SSL handshake, and more work encrypting and decrypting data as it is sent. While computers are always getting faster, SSL overhead is still large enough that an entire market for SSL acceleration products exists.
- Because HTTPS runs on a different TCP port than HTTP, your browser cannot reuse an existing HTTP connection as an HTTPS connection, even when it is talking to the same hostname.
- Using SSL means inline devices like shared caching servers will not see the traffic and cannot be used to improve performance.
- Browser caching of content served over SSL is more complicated than caching of content served over HTTP. Depending on the browser and its configuration, content may only be cached in RAM and discarded quickly, or may require conditional revalidation requests that would not otherwise be needed.
In short, SSL is great but it’s not free. Don’t use it if you don’t have to.
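If you want a rough sense of what that connection overhead costs on your own pages, here is a small sketch using the Resource Timing API. This API is not part of Google’s snippet or documentation, it is not available in every browser, and the numbers it reports are only approximate, but it is a quick way to see the handshake cost described above:

```html
<script type="text/javascript">
// Rough sketch: where the browser supports Resource Timing, estimate how much
// time TLS negotiation added for each https:// resource on the page.
if (window.performance && performance.getEntriesByType) {
  var entries = performance.getEntriesByType('resource');
  for (var i = 0; i < entries.length; i++) {
    var entry = entries[i];
    // secureConnectionStart is 0 for plain http:// resources
    // (or when the browser chooses not to report it)
    if (entry.secureConnectionStart > 0) {
      var tlsTime = entry.connectEnd - entry.secureConnectionStart;
      console.log(entry.name + ': ~' + tlsTime.toFixed(1) +
        ' ms spent on the SSL/TLS handshake');
    }
  }
}
</script>
```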
The solution here is actually to use a protocol relative URL. A protocol relative URL is a way of referencing a resource on a different host name without specifying which protocol to use to retrieve it. So instead of src="https://apis.google.com/js/plusone.js" you can use src="//apis.google.com/js/plusone.js". Consider an HTML page which uses a protocol relative URL to reference plusone.js. If the page was served using https://, then plusone.js is requested using https://. Security is maintained and no mixed content warning will appear. If the page was served using http://, then the library is requested using http://. There is no SSL performance hit, and no caching issues come up either.
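Putting it together, the only change to Google’s snippet is dropping the protocol from the src attribute; the browser fills it in from the page itself:

```html
<!-- Protocol relative reference: the script inherits the protocol of the page.
     On an https:// page it resolves to https://apis.google.com/js/plusone.js;
     on an http:// page it resolves to http://apis.google.com/js/plusone.js. -->
<script type="text/javascript" src="//apis.google.com/js/plusone.js"></script>
```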
Now, I know what you might be thinking. “Did Stoyan seriously allow some guy a spot on the Performance Calendar to talk about protocol relative URLs for eleven paragraphs?” Well yes, I did talk about something cool that many people are not familiar with and which provides an elegant solution to a surprisingly common problem. (In fact, there is a ton of other stuff to talk about with protocol relative URLs, like a non-standard IE6 configuration which causes a weird certificate error, or the double downloading bug in IE7 and IE8. So count yourself lucky!) But as I said earlier, the magic of protocol relative URLs is not the point of this post.
The point of this post is that you need to be careful about performance advice. Not just where you get it, but what it says to do. Google is awesome. They are one of the strongest supporters of web performance in the industry today. But no one is perfect. Mathias improved upon their Google Analytics snippet. Their Google Doodles are always ludicrously high quality JPEGs that needlessly waste bandwidth. And sometimes, as in this case, their advice is not quite right. As the Buddha once said:
Believe nothing, no matter where you read it, or who has said it, not even if I have said it, unless it agrees with your own reason and your own common sense.
You should always examine a 3rd party code snippet before including it in your site, regardless of who wrote it, even Steve Souders or Douglas Crockford or John Resig, to make sure it does not violate any of the best practices you already know.