Much of SEO is “fuzzy”. So it makes sense that we try to maximize whatever metrics we can. But metric chasing can be misleading. For example:
- Pageviews (without concern for targeted traffic)
- Rankings (without concern for audience intention)
- Number of backlinks (without concern for quality/relevance)
All these metrics are good – when applied properly. Otherwise, you have the illusion of progress, but nothing more. I can get hundreds of thousands of pageviews, but unless my site serves visitor intent in a way that leads to conversions, they’re worthless.
Page speed is yet another such metric. It has received renewed focus with Google’s Lighthouse project, which gives you detailed information about how your site performs on a variety of metrics – accessibility, loading speed, and so on – and assigns you a score. We also know that page speed plays a part (albeit a small one) in rankings.
As a result, and thanks to the lure of metrics that provide an illusion of progress, I see a lot of people overly focused on getting higher page speed scores – to the point that they almost break their sites to achieve that magic “100%” optimization! Here are some insights into how important page speed really is.
Insight 1: Most Sites SHOULDN’T Aim for 100% Optimization
A quick Google search will show you people asking for tips on how to remove that last pesky jQuery script or move it to the footer. And while it’s possible to completely optimize your site’s loading speed, doing so usually comes at the cost of functionality.
Take jQuery, for example. It’s so ubiquitous that other scripts probably depend on it, so it needs to load early. If you try to “defer” or “async” it, chances are you’ll break your site’s functionality in some unforeseen way. Even the simplest sites tend to use some form of jQuery – whether for search, social counts, a table of contents, or something else.
In other words, unless your website is nothing but plain HTML with no external scripts, you’ll need to load at least something in the header.
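As a sketch of that compromise (file names and script roles here are illustrative, not from any particular site): keep jQuery as a normal synchronous script in the head so anything depending on `$` works, and defer only scripts that don’t touch it.

```html
<head>
  <!-- Load jQuery synchronously so later scripts can rely on $ -->
  <script src="/js/jquery.min.js"></script>

  <!-- Safe to defer: this script does not use jQuery -->
  <script src="/js/analytics.js" defer></script>
</head>
<body>
  <!-- ... page content ... -->

  <!-- Scripts that use $ can also go at the end of <body>,
       after jQuery has already loaded -->
  <script src="/js/toc.js"></script>
</body>
```

Worth noting: `defer` does preserve execution order among deferred scripts, so deferring jQuery *and* everything that depends on it can work in principle. The typical breakage comes from inline `<script>$(…)</script>` snippets – common in plugins and theme options – which run during parsing and fail if jQuery itself has been deferred.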
Insight 2: Simultaneous Resource Downloads Mean You Can Get Away with More
This doesn’t mean you should skip best practices like inlining “above the fold” CSS, of course. That’s still important. It just means that with HTTP/2 multiplexing many requests over a single connection, the browser can happily download other resources alongside jQuery without slowing down your load time.
But your PageSpeed score will complain if you do this – just one example of why PageSpeed isn’t always representative of real user experience.
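The “above the fold” practice mentioned above can be sketched like this (selectors and file names are placeholders): inline the small amount of CSS needed to paint the visible top of the page, and load the full stylesheet without blocking rendering.

```html
<head>
  <!-- Critical CSS inlined: no network round trip before first paint -->
  <style>
    body { margin: 0; font: 16px/1.5 sans-serif; }
    .site-header { height: 60px; background: #fff; }
  </style>

  <!-- Full stylesheet loaded without render-blocking;
       the media/onload swap is a common deferral pattern -->
  <link rel="stylesheet" href="/css/main.css"
        media="print" onload="this.media='all'">
  <noscript><link rel="stylesheet" href="/css/main.css"></noscript>
</head>
```

The `media="print"` trick tells the browser the stylesheet isn’t needed for on-screen rendering, so it downloads at low priority; the `onload` handler then switches it to `all` once it arrives. The `<noscript>` fallback keeps styling intact for users without JavaScript.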
Insight 3: The More Optimized You Are, the More Things Can Go Wrong
I’m a fan of Nassim Nicholas Taleb’s ideas on fragility, and a key insight is that over-optimization and too much efficiency make systems vulnerable to volatility.
I’ve experienced this first hand on my own web pages. After a point, the effort required to squeeze out a few milliseconds of speed is entirely disproportionate to the benefits, and the compromises you have to make to achieve those milliseconds leave your site ripe for failure.
So before making any specific change, ask yourself: “Can something go wrong?” Try not to think about specifics; instead, get a feel for whether or not the change increases your site’s fragility. Chances are your gut instinct will be right, and if a change smells bad, stay away from it. It’s usually not worth it.
It is preferable to err on the side of reliability, rather than efficiency. If you can obtain a good page speed score with basic best practices, then go for it and be happy. But learn to recognize the line after which you start getting diminishing returns.
Stop there, and go no further!
I’m a NameHero team member, and an expert on WordPress and web hosting. I’ve been in this industry since 2008. I’ve also developed apps on Android and have written extensive tutorials on managing Linux servers. You can contact me on my website WP-Tweaks.com!