I maintain 3 websites: my main site WP-Tweaks.com, my personal blog, and my professional site. I don't put much effort into the latter two, and they both get low traffic, which is fine for me. There was a time when I worked a lot more on my blog, but that was long ago. I recently wanted to test BunnyCDN, and not wanting to experiment on my main site, I tried it out on the other two instead. And what I learned was this: a CDN doesn't really make a difference to low-traffic sites.
Graphs Showing the Hit Ratio
I started the experiment close to a month ago, figuring that would give the CDN enough time to "warm up" and stabilize. I don't actively work on either of the two sites, so it should be a good way to test things out. After over 3 weeks, here's the result:
As you can see, the CDN has served more uncached bandwidth than cached bandwidth even after all this time. The hit ratio is just 60%, whereas I expected it to be a lot more. Keep in mind that this is just for static assets. Dynamic HTML requests are not going through the CDN.
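To be precise, the hit ratio here is just the cached share of total CDN bandwidth. A minimal sketch of the calculation (the 6/4 GB split is illustrative; only the resulting 60% matches the dashboard figure above):

```python
def hit_ratio(cached_gb: float, uncached_gb: float) -> float:
    """Fraction of total CDN bandwidth that was served from cache."""
    total = cached_gb + uncached_gb
    if total == 0:
        return 0.0
    return cached_gb / total

# Illustrative bandwidth totals producing the 60% seen on the dashboard:
print(f"{hit_ratio(6.0, 4.0):.0%}")  # → 60%
```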
CDNs Clear Infrequently Accessed Items
CDNs will tell you to configure the "max-age" directive in your Cache-Control header as a signal for how long you want content to remain on their servers without being refreshed. But what they don't tell you is that none of them treat it as a guarantee. CDNs can and will clear their cache well in advance of the max-age directive if your content is infrequently accessed.
From their point of view, it’s a storage issue. They don’t want to hold onto content that might not be accessed for a long time. So all CDNs have an algorithm that determines which content is widely used, and they purge the other stuff aggressively. They all do this. I’ve tested this with Cloudflare, KeyCDN, and as you can see above, BunnyCDN.
BunnyCDN's "Persistent Storage" Doesn't Seem to Help
I was attracted to BunnyCDN because they have a feature called “Perma-Cache”, which is supposed to replicate your data across all their data centers and store it permanently. They have a special add-on service for this, where you pay for the cost of the storage. A fair deal in my opinion. I thought it would solve the problem of infrequently accessed content, and all my files would need to be requested just once.
Unfortunately, it doesn't seem to work. So far, the system informs me that 288 of my files have been cached in permanent storage, and I expected the cache hit ratio to climb steadily as more and more static files were stored. But that hasn't materialized. After 3 weeks, the cache hit ratio isn't going up. So there must be some nuance that I'm missing.
No Option But to Optimize the Server to Serve Static Files
Given that no CDN is going to keep your files on their servers permanently, you need to optimize your own server to serve static files. On Apache, that means putting a caching reverse proxy in front of it. Or if you're using NameHero, the LiteSpeed web server fills the same role, serving static files directly without the overhead of Apache's request handling.
Another measure you can take is to protect your site from bots. Cloudflare has a great feature for "Pro" users and above that allows you to block traffic it classifies as "definitely" automated, with exceptions for the "good" bots like the known search engines.
This feature alone knocks out around 20% of annoying traffic. Unfortunately, the "good" bots account for a lot more, roughly 40% of total traffic in my case, and you don't want to block these. If your CDN handles 60% of their requests, the remaining 40% of that slice, or 16% of total traffic, hits your website directly. And this traffic tends to come in bursts rather than being spread out gracefully. So when a burst hits, you had better make sure that your server is ready to handle the influx!
I’m a NameHero team member, and an expert on WordPress and web hosting. I’ve been in this industry since 2008. I’ve also developed apps on Android and have written extensive tutorials on managing Linux servers. You can contact me on my website WP-Tweaks.com!