A few days back, I got this notification from the Google Search Console on one of my older sites:
Apparently, there were sudden “mobile usability” issues – and the problems were hard to debug. Namely:
- Clickable elements too close together
- Text too small to read
Now I always use responsive themes for my sites – I’ve mentioned earlier that responsive sites are the way forward. So this message from Google was a bit strange. I figured that since the messages were about text being too small and too close together, I should increase the font size of some of the sidebar items, which seemed cluttered to me. I then submitted my site for re-indexing.
Unfortunately, the next day I got a message saying that the issue still exists:
At this point, I just didn’t know what to do. The messages in the Search Console weren’t super helpful. They didn’t specify which elements were too close together or which text was too small to read! So I had to do a lot more debugging.
Making Google Render the Page
In an attempt to narrow down the problem, I wanted to see what Google saw. In the new search console, you can inspect the problem page and click the text “View Crawled Page”. This will open up a panel on the right-hand side where you can test the live URL as seen here:
The Googlebot will then crawl your page and render it as seen on a mobile screen. And this is what I saw:
Clearly, something was wrong – there was no CSS at all! I initially thought it might have been a one-off rendering glitch, but I repeated the test and saw the same thing again and again. This was what Google was seeing on my site. When I loaded the page in a regular web browser, everything looked great.
So what was the problem?
A short while ago, I was having problems with excessive bot traffic and had created what I now realize was an overly restrictive robots.txt. Specifically, I had blocked crawlers from certain sections of my WordPress installation.
These were the rules I was implementing:
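I no longer have the exact file on hand, but an overly restrictive robots.txt along these lines – a sketch for illustration, not a copy of my original rules – would block crawlers from the core WordPress directories:

```
# Hypothetical example of overly restrictive rules –
# blocking the standard WordPress directories wholesale
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /wp-content/
```

The last line is the killer: themes, plugins, and uploads all live under /wp-content/, so any crawler that honors these rules can’t fetch your stylesheets.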
I’m not sure right now, but I might even have been blocking the “wp-content” folder! And as we all know, that’s where we find our theme’s stylesheet. Having such a restrictive robots.txt was a terrible solution for two reasons:
- Spam bots are not going to respect robots.txt anyway
- Genuine and well-behaved crawlers like the Googlebot are sticklers for it
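You can see this effect yourself with Python’s built-in `urllib.robotparser`. The rules and stylesheet URL below are made up for illustration, but they show how a `Disallow: /wp-content/` line blocks a well-behaved crawler like the Googlebot from fetching a theme’s stylesheet:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical rules, similar in spirit to my overly restrictive file
rules = """
User-agent: *
Disallow: /wp-content/
Disallow: /wp-includes/
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

# A made-up stylesheet URL under wp-content, as any WordPress theme would have
css_url = "https://example.com/wp-content/themes/mytheme/style.css"
print(parser.can_fetch("Googlebot", css_url))  # prints "False"
```

Since there is no Googlebot-specific entry, the `User-agent: *` rules apply – and the stylesheet is off-limits, so the page renders unstyled.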
So the end result was that Google thought my site looked like utter crap on mobile phones because it wasn’t able to download the necessary stylesheets.
Once I relaxed the rules in robots.txt, I was finally able to get Google to render my website the way it’s supposed to look. And voilà! My problems disappeared.
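For reference, a more relaxed setup – again a sketch, not my exact file, but along the lines of what WordPress itself generates by default – keeps crawlers out of the admin area while leaving themes and uploads fetchable:

```
# Block only the admin area, but keep admin-ajax.php reachable
# since some front-end features depend on it
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
```

Anything not explicitly disallowed – including /wp-content/ and its stylesheets – stays crawlable.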
Mobile First Indexing Changes Things
Always remember that Google now crawls your site as a mobile page first. That means your site has to look good on a mobile phone if you want to impress Google. It’s one of the reasons I’ve stopped using tables that change their layout on small screens and have opted for horizontal scrolling instead. When Google picks up my featured snippets, it does so from the mobile version, and a reflowed table doesn’t look good at all.
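If you want the horizontally scrolling behaviour I described, one common approach is to wrap wide tables in a container with `overflow-x: auto`. The class name here is my own, not from any particular theme:

```css
/* Wrapper around a wide table: the table keeps its natural
   width and the wrapper scrolls sideways on narrow screens */
.table-scroll {
  overflow-x: auto;
}

.table-scroll table {
  min-width: 600px; /* assumed minimum width – adjust per table */
}
```

The table itself stays intact, so a featured snippet lifted from the mobile page looks the same as it does on desktop.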
This is an important paradigm shift that all of us who design websites have to consider. And that includes making parts of our site accessible to the Googlebot that we wouldn’t normally worry about when viewing it on a regular web browser. So the next time you have mobile usability warnings on your search console, take a look at robots.txt to see if you have anything overly restrictive.
I’m a NameHero team member, and an expert on WordPress and web hosting. I’ve been in this industry since 2008. I’ve also developed apps on Android and have written extensive tutorials on managing Linux servers. You can contact me on my website WP-Tweaks.com!