What’s the Big Deal Here?
Fixing these issues is important because when Google cannot crawl your site properly, the search engine cannot build a full picture of what your website has to offer its users. Without this visibility, Google’s trust in your site diminishes, making the search engine less likely to serve your content to searchers.
Basically, if Google can’t crawl your site, Google can’t trust your site, and Google won’t serve your content. No Bueno. So, what’s the fix?
How Can I Avoid This?
Fixing these issues is, in most cases, quite simple. Here are a few key tools you can use.
First off, use Google’s Fetch as Google tool before you fix anything, to identify exactly which files are being blocked and how each blockage affects Google’s view of each page on your site. Then use this information to prioritize your optimization plan: unblock the files that are hiding important content or functionality first, and fix less important pages second.
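If you want a quick scripted first pass to go along with Fetch as Google, here is a minimal Python sketch using the standard library’s robots.txt parser, with hypothetical URLs you would swap for your own. One caveat: the stdlib parser understands simple prefix rules like Disallow: /plugins/ but not Google’s wildcard extensions (* and $), so treat its output as a rough approximation, not a replacement for the real tool.

from urllib.robotparser import RobotFileParser

# Hypothetical domain and asset URLs; substitute your own.
ROBOTS_URL = "https://www.example.com/robots.txt"
ASSET_URLS = [
    "https://www.example.com/wp-content/plugins/slider/slider.js",
    "https://www.example.com/wp-content/themes/mytheme/style.css",
]

parser = RobotFileParser()
parser.set_url(ROBOTS_URL)
parser.read()  # downloads and parses the live robots.txt

for url in ASSET_URLS:
    # can_fetch() applies prefix matching against the Disallow rules
    status = "BLOCKED" if not parser.can_fetch("Googlebot", url) else "ok"
    print(status, url)

Run it before and after you edit robots.txt to confirm the assets you care about flip from BLOCKED to ok.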
You can also use Google’s Mobile-Friendly Test to make sure robots.txt isn’t blocking any of your site’s CSS or JS files. When those files are blocked, Googlebot can’t render your pages the way visitors see them, which can keep Google from recognizing your site as mobile-friendly, even if it actually is mobile-friendly.
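Google has also offered the Mobile-Friendly Test programmatically through its Search Console URL Testing Tools API; the endpoint, field names, and current availability in the sketch below are assumptions to verify against Google’s documentation before relying on them.

import requests  # third-party: pip install requests

# Hypothetical values; substitute your own API key and page URL.
API_KEY = "YOUR_API_KEY"
ENDPOINT = "https://searchconsole.googleapis.com/v1/urlTestingTools/mobileFriendlyTest:run"

resp = requests.post(
    ENDPOINT,
    params={"key": API_KEY},
    json={"url": "https://www.example.com/"},
    timeout=60,
)
resp.raise_for_status()
result = resp.json()

print("Verdict:", result.get("mobileFriendliness"))
# resourceIssues is expected to list resources (often CSS/JS) Googlebot could not load
for issue in result.get("resourceIssues", []):
    print("Blocked resource:", issue.get("blockedResource", {}).get("url"))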
After you run the Fetch as Google tool, check to see if any scripts or stylesheets are trapped behind Disallow rules in your robots.txt file. Two common culprits:

Folder rules such as Disallow: /plugins/, which accidentally block the CSS and JS files living inside those folders.

Wildcard rules such as Disallow: /*.js$ and Disallow: /*.css$, which block every script and stylesheet on the site. Remove these entirely.
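If a folder needs to stay disallowed for other reasons, you can still let Googlebot reach the assets inside it. A minimal robots.txt sketch, assuming a hypothetical /plugins/ folder:

User-agent: Googlebot
Allow: /plugins/*.css
Allow: /plugins/*.js
Disallow: /plugins/

Google resolves conflicting rules by the most specific (longest) matching path, so the Allow lines win for CSS and JS files while everything else under /plugins/ stays blocked. Other crawlers may not honor this precedence the same way, so double-check if you care about more than Googlebot.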
In the end, Google warned us about these errors last year. They are not hard to fix, but we should still take the time to optimize our sites and follow this protocol.
Did you like this post? Then you’ll love our newsletter! Subscribe and get leading industry insights delivered straight to your inbox.