
How to Optimize After Googlebot’s “Cannot Access Your JavaScript and CSS Files” Warnings


Received any warnings recently from Google Search Console? Back on October 27, 2014, Google announced it had updated its Webmaster Guidelines to include information about JavaScript and CSS file blocking. If you haven't received any warnings recently, there is no need to worry. However, if you were indeed a lucky recipient of one of these recent warnings tied to that update, your site may have some JavaScript and CSS blocking issues. These issues need to be fixed in order for Google to properly crawl and index your site. But why? Well, let's continue down that rabbit hole.

What’s the Big Deal Here?

Fixing these issues is important because when Google cannot crawl your site properly, the search engine cannot build a full picture of what your website has to offer its users. Without this visibility, Google's trust in your site diminishes, making the search engine less likely to serve your content to searchers.

In more technical terms… By blocking these file types, you affect how well Google's algorithms render and index your content, which, of course, may have a negative impact on your site's overall rankings. JavaScript and CSS files play a huge role in creating a website's look and feel, and, in some cases, a site's content may even be JavaScript-dependent. By blocking these files, you risk hiding functionality and visuals from the search engine.

Basically, if Google can't crawl your site, Google can't trust your site, and Google won't serve your content. No bueno. So, what's the fix?

How Can I Avoid This?

Fixing these issues is, in most cases, quite simple. Here are a few key tools you can use.

First off, use Google's Fetch as Google tool before you fix anything, to help identify exactly which files are being blocked and how that blockage affects Google's visibility into each page of your site. Then, use this information to prioritize your optimization plan: unblock the files that are hiding important content or functionality first, and fix less important pages after that.

You can also use Google's Mobile-Friendly Test to make sure robots.txt isn't blocking any of your site's CSS or JS files. This kind of blockage can keep Google from rendering your responsive layout, so the search engine may fail to recognize your site as mobile-friendly even when it is.

After you run the Fetch as Google tool, check to see if you currently have any scripts trapped in disallowed folders in your robots.txt file. Below are examples:

Disallow: /media/
Disallow: /plugins/

Check your file management system for any JavaScript or CSS that may accidentally fall into any of your disallowed folders. Also remove any blanket rules such as Disallow: /*.js$ and Disallow: /*.css$, which block every JavaScript and CSS file on your site.
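Pulled together, a robots.txt that would trigger these warnings might look something like this (the folder names here are purely illustrative):

User-agent: *
Disallow: /media/
Disallow: /plugins/
Disallow: /*.js$
Disallow: /*.css$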

To fix these issues, move your JavaScript and CSS files outside of blocked directories, or remove the disallow rules themselves. You can also update the rules to allow the specific subdirectories where these files live, while continuing to disallow the higher-level directories, as shown below. Once you are done, run Fetch as Google again to check your work and catch any lingering errors you may have missed.
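As a sketch of that subdirectory approach (again with illustrative folder names), the corrected file drops the wildcard rules and explicitly allows the asset folders; Google honors the longest matching rule, so the Allow lines win out over the broader Disallow lines:

User-agent: *
Disallow: /media/
Allow: /media/js/
Allow: /media/css/
Disallow: /plugins/
Allow: /plugins/assets/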

In the end, Google warned us about these errors last year. They are not hard to fix, so there is no excuse not to optimize your site and follow this protocol.

Did you like this post? Then you’ll love our newsletter! Subscribe and get leading industry insights delivered straight to your inbox.
