For a long time, I didn’t give much thought to caching static content at the server level. After all, I already use a CDN (Cloudflare), so why should I care how fast static files are served? A few days ago, as the old year rolled over into the new, I realized how important it is to have a reverse proxy sitting in front of Apache to speed up the serving of not just dynamic, but static files as well. Here’s what happened.
Over the New Year, a bunch of legitimate bots crawled my site – Ahrefs, SemRush, and more. This continued over a few hours. They requested not just dynamic content, but every single image on my site. Here’s the spike in web traffic around 4:00 in the morning:
None of this was malicious. I checked the logs, and it was just normal bots requesting content every couple of seconds. Each file was requested just once, so they didn’t benefit from CDN caching. I then checked my page speed logs and found a huge increase in the response time of my site. Here’s the screenshot from Cloudflare’s page performance monitor:
Luckily, 4:00 am isn’t when I get my most important traffic. But you can see that the response times on my site really spiked when all those static files were being requested. I noticed the same thing on my Google Analytics page speed reports which I’ve configured to sample 100% of incoming requests.
This isn’t the first time I’ve seen this, and it isn’t an issue that WordPress caching plugins or a CDN can solve.
Why a CDN Doesn’t Help Here
In my experience, CDN providers evict static files that haven’t been accessed for about two days. I’ve tested this on major providers like Cloudflare, KeyCDN, and BunnyCDN, and they all purge infrequently requested content. Every site has a large chunk of infrequently accessed content – I’d even say the bulk of a site’s images are barely ever requested. None of those files stay cached.
So when a bot systematically accesses your images one by one for indexing, all those requests hit your server directly. A CDN is great for handling traffic spikes to a few pages – the way human visitors work. But it simply isn’t helpful against bots pulling your files one by one.
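The difference between human traffic and a crawl is easy to see with a toy simulation. This is an illustrative sketch, not a model of any real CDN: it treats the edge cache as a simple set, and the URLs and request counts are made up for the example.

```python
# Illustrative sketch: why one-off bot requests bypass a CDN cache.
# An edge cache only helps on repeat requests; a crawler that fetches
# every file exactly once gets a cache miss on each of them.

def cache_hit_rate(requests):
    """Return the fraction of requests served from a simple edge cache."""
    cache = set()
    hits = 0
    for url in requests:
        if url in cache:
            hits += 1
        else:
            cache.add(url)  # first request goes to the origin, then is cached
    return hits / len(requests)

# Human-like traffic: a few popular pages requested over and over.
human = ["/", "/about", "/"] * 100
# Bot crawl: every image requested exactly once.
bot = [f"/img/{i}.jpg" for i in range(300)]

print(round(cache_hit_rate(human), 2))  # 0.99 – almost every request is a hit
print(round(cache_hit_rate(bot), 2))    # 0.0 – every request hits the origin
```

Even with an infinitely large cache and no expiry, the crawl gets a 0% hit rate, because nothing is ever requested twice. Real CDNs with eviction do strictly worse.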
Caching Plugins Don’t Solve This Either
Plugins like W3 Total Cache store static copies of your dynamic pages, but those copies are still served through PHP and Apache. They don’t speed up the delivery of static content, and they wouldn’t have helped me deal with the traffic spike you see in the screenshots.
The fundamental problem here is that Apache by itself just isn’t very good at serving static content.
Solution 1: Use a Reverse Proxy like NGINX or Varnish with WordPress
This problem isn’t new. It’s the main reason why so many people use reverse proxies that sit in front of Apache and cache static content to serve it much faster than Apache can.
The consensus appears to be that NGINX is easier to set up than Varnish. Of course, you can’t do anything about this if you’re on shared hosting that doesn’t provide a reverse proxy. But if you’re on a VPS, you can try setting it up yourself. Engintron is a neat cPanel plugin that lets you set up NGINX in a few clicks.
For other control panels, the setup is trickier.
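For reference, here’s a minimal sketch of what an NGINX-in-front-of-Apache setup looks like. The ports, paths, domain name, and cache sizes below are all assumptions for illustration (Apache is assumed to be listening on 127.0.0.1:8080), not a drop-in config:

```nginx
# Cache zone for static files served by NGINX on Apache's behalf.
proxy_cache_path /var/cache/nginx levels=1:2 keys_zone=static:10m
                 max_size=1g inactive=7d use_temp_path=off;

server {
    listen 80;
    server_name example.com;  # placeholder domain

    # Serve static assets from the proxy cache, so a bot crawl
    # only ever hits Apache once per file.
    location ~* \.(jpg|jpeg|png|gif|css|js|ico|svg|woff2?)$ {
        proxy_pass http://127.0.0.1:8080;
        proxy_cache static;
        proxy_cache_valid 200 7d;
        add_header X-Proxy-Cache $upstream_cache_status;
    }

    # Dynamic (PHP/WordPress) requests pass straight through to Apache.
    location / {
        proxy_pass http://127.0.0.1:8080;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
    }
}
```

The key idea is the first `location` block: static file requests are answered from NGINX’s local cache after the first fetch, and only dynamic requests reach Apache and PHP.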
Solution 2: Use LiteSpeed with NameHero
NameHero uses LiteSpeed, which performs just as well by itself as Apache does with NGINX as a reverse proxy. Here’s a feature comparison on the LiteSpeed website showing the difference between LiteSpeed, Varnish, and NGINX as a reverse proxy.
If you already use NameHero as a web host, then you’re in luck – LiteSpeed is enabled on all shared hosting plans by default! So you don’t have to worry about situations like the one above where static files requested by robots slow down your site. LiteSpeed is already caching your static files and you don’t need to use a server-based reverse proxy like NGINX.
I’m a NameHero team member, and an expert on WordPress and web hosting. I’ve been in this industry since 2008. I’ve also developed apps on Android and have written extensive tutorials on managing Linux servers. You can contact me on my website WP-Tweaks.com!