I’m always monitoring my traffic to see which requests are having the biggest impact on my origin server. I’ve already shown how to protect your WordPress login page using firewall rules, and the challenges of rate limiting. But you need to keep a constant lookout for new vulnerabilities being exploited. If you don’t, your site’s performance will suffer and you’ll never even know why.
Enumeration Attacks are Dangerous
An enumeration attack is when an automated script finds a way to constantly ping your server with a new URL by changing a query parameter somewhere. Since each request is for a new URL, your server has difficulty blocking them via pattern matching. And it’s not easy to say which traffic is valid and which isn’t until a trend becomes visible.
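To make the pattern concrete, here’s a minimal sketch of what such a script does (the domain, path, and ID range are placeholders, not taken from the attack I observed): it walks a range of IDs, producing a distinct URL for every request, which is exactly why per-URL caching and simple pattern matching don’t help.

```python
# Hypothetical sketch of an enumeration pattern. The target URL and
# parameter are placeholders for illustration only.
from urllib.parse import urlencode

BASE = "https://example.com/some-post/"  # placeholder target

def enumerated_urls(start, end, param="replytocom"):
    """Yield one unique URL per ID in [start, end)."""
    for comment_id in range(start, end):
        # Each iteration changes only the query string, so every
        # request looks like a brand-new URL to the origin server.
        yield f"{BASE}?{urlencode({param: comment_id})}"

urls = list(enumerated_urls(1000, 1005))
print(urls[0])  # https://example.com/some-post/?replytocom=1000
```

Five IDs yield five unique URLs; a cache keyed on the full URL misses every single one of them.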
A couple of days back, I saw an unusual spike of traffic hitting my origin server on WP-Tweaks. Nothing too big, but enough to warrant a closer look:
The blue spike is requests hitting my origin server and bypassing Cloudflare’s cache. Since Cloudflare doesn’t break down analytics by query parameters, I had to open the raw access logs of my site from cPanel. When I zoomed into the particular time period during which the spike took place, here is what I found:
The same IP address was hitting my pages and making a separate request for each comment ID. This was causing 301 redirects and forcing my server to serve the same page over and over. As far as resource usage goes, it wasn’t anything serious; my server was handling it just fine. But I can see the potential for abuse here.
This particular enumeration attack involved the “replytocom” query parameter.
What is the “replytocom” Parameter in WordPress?
In WordPress, each “Reply” button on a comment is assigned a unique query parameter, which is simply the comment ID. Here’s a screenshot from an article on my site:
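In other words, the “Reply” link for a comment typically takes this form (the domain and path here are placeholders):

```
https://example.com/some-post/?replytocom=1234#respond
```

The number is the comment’s ID, so a site with thousands of comments exposes thousands of these unique URLs, one per reply button.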
So the fix is twofold.
For Well-Behaved Bots
For bots that follow the rules and obey robots.txt, the solution is a simple one. Simply add this line to your robots.txt and you’re done:
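A commonly used directive for this purpose (the wildcard syntax is honored by major crawlers such as Googlebot) looks like this:

```
User-agent: *
Disallow: /*?replytocom=
```

This tells compliant crawlers to skip any URL whose query string starts with the replytocom parameter, so they stick to the canonical page instead.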
This will ensure that search engines and others don’t follow useless links and cause your website to slow down as thousands upon thousands of unique URLs are requested and processed by your origin server. The problem, of course, is that robots.txt is only meaningful to well-behaved bots that follow the industry standards. It’s not going to stop anyone who wants to DDoS your site.
For Badly Behaved Bots
Since there’s no legitimate reason for anyone to visit a “replytocom” link on your website directly, I solved the problem with a simple firewall rule that blocks these URLs in their entirety:
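The exact rule isn’t shown here, but on Cloudflare a rule to this effect can be written with an expression like the following, paired with the “Block” action:

```
(http.request.uri.query contains "replytocom")
```

This matches any request whose query string mentions the parameter at all, regardless of the comment ID, so the enumeration trick of varying the ID no longer helps the attacker.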
This works nicely. So now when a bot tries to crawl the comments section on my site, it’ll just get a “blocked” response. Problem solved!
I’m a NameHero team member, and an expert on WordPress and web hosting. I’ve been in this industry since 2008. I’ve also developed apps on Android and have written extensive tutorials on managing Linux servers. You can contact me on my website WP-Tweaks.com!