WordPress Website Tip: "Googlebot Cannot Access CSS and JS Files…" – Error Message

Have you recently received a message from Google warning that it cannot read and index your website optimally?

Google Search Console has begun sending a new warning to webmasters who block CSS and JavaScript files on their websites. The warning states emphatically that blocking JavaScript and/or CSS “can result in suboptimal rankings.”

Many site owners have noticed an increase in messages from Google Search Console stating that Googlebot cannot access the CSS and JS files for a given URL. The message is part of Google’s recent push for greater transparency about search ranking factors.
Below is a picture example of what the message looks like:

[Image: Google Search Console warning — “Googlebot cannot access CSS and JS files”]

The solution lies in your robots.txt file. First, you may be asking: what is robots.txt?

The robots exclusion protocol (REP), or robots.txt, is a text file webmasters create to instruct robots (typically search engine crawlers) which pages on their website they may crawl. Robots.txt must be placed in the top-level directory of the web server in order to be honored. Example: http://www.example.com/robots.txt
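To see how a crawler consults robots.txt before fetching a URL, here is a minimal sketch using Python’s standard-library `urllib.robotparser`. The rules and `example.com` below are placeholders, not your site’s actual file:

```python
# Sketch: how a crawler checks robots.txt before fetching a URL.
# The rules string and example.com are placeholders for illustration.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Disallow: /wp-admin/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# The homepage is not covered by any Disallow rule, so it is crawlable...
print(parser.can_fetch("Googlebot", "https://example.com/"))           # True
# ...but anything under /wp-admin/ is blocked for all user agents.
print(parser.can_fetch("Googlebot", "https://example.com/wp-admin/"))  # False
```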

To solve this problem:

1. For WordPress websites only. Skip to #2 if you don’t have a WP website. Your current robots.txt most likely begins with:

User-agent: *
Disallow: /wp-admin/

Keep that rule, but add an Allow line for admin-ajax.php, which themes and plugins often call from the front end:

User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

(Simply removing the trailing slash, as some guides suggest, does not help: `Disallow: /wp-admin` matches everything `Disallow: /wp-admin/` does, plus any other path beginning with /wp-admin.)
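Whatever /wp-admin rule you settle on, you can sanity-check it with Python’s stdlib `urllib.robotparser`. The sketch below checks a common WordPress layout in which /wp-admin/ is blocked but admin-ajax.php (which front-end pages often call) stays allowed; `example.com` is a placeholder. Note that Python’s parser applies rules in file order (first match wins), unlike Googlebot’s longest-match rule, so the Allow line is listed first here:

```python
# Sanity-check sketch: admin-ajax.php stays fetchable while the rest of
# /wp-admin/ remains blocked. Python's stdlib parser uses first-match-wins
# in file order (Googlebot uses longest match), so Allow comes first here.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Allow: /wp-admin/admin-ajax.php
Disallow: /wp-admin/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

print(rp.can_fetch("Googlebot", "https://example.com/wp-admin/admin-ajax.php"))  # True
print(rp.can_fetch("Googlebot", "https://example.com/wp-admin/update.php"))      # False
```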

2. Look through the robots.txt file for any of the following lines (in robots.txt patterns, `*` matches any sequence of characters and `$` anchors the end of the URL, so nothing may follow the `$`):

Disallow: /*.js$

Disallow: /*.inc$

Disallow: /*.css$

Disallow: /*.php$

If you see any of those lines, remove them. That’s what’s blocking Googlebot from crawling the files it needs to render your site the way visitors see it.
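Python’s stdlib robotparser does not implement Google’s wildcard syntax, so here is a small sketch of how that matching works (per Google’s robots.txt documentation: `*` matches any sequence of characters, `$` anchors the end of the URL). `google_pattern_matches` is a hypothetical helper written for illustration; it shows why a pattern like `/*.css$` blocks every stylesheet on the site:

```python
# Sketch of Googlebot-style wildcard matching ('*' = any character sequence,
# '$' = end of URL), which Python's stdlib robotparser does not support.
import re

def google_pattern_matches(pattern: str, path: str) -> bool:
    """Return True if a robots.txt path pattern matches the given URL path."""
    anchored = pattern.endswith("$")
    if anchored:
        pattern = pattern[:-1]
    # Escape regex metacharacters, then turn each robots.txt '*' into '.*'.
    regex = ".*".join(re.escape(part) for part in pattern.split("*"))
    regex = "^" + regex + ("$" if anchored else "")
    return re.match(regex, path) is not None

print(google_pattern_matches("/*.css$", "/wp-content/themes/foo/style.css"))  # True
print(google_pattern_matches("/*.css$", "/style.css?ver=4.2"))                # False: '$' anchors the end
print(google_pattern_matches("/*.js", "/wp-includes/js/jquery/jquery.js"))    # True
```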

3. Add this at the end of your robots.txt:

User-agent: Googlebot
Allow: /*.css
Allow: /*.js

Note: Googlebot follows only the most specific matching group, so once this `User-agent: Googlebot` group exists, Googlebot ignores the `User-agent: *` rules. Copy any Disallow lines you still want to apply to Googlebot into this group as well.
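Before calling it done, you can scan the finished file for rules that still block assets. The sketch below is a rough heuristic line scan, not a full robots.txt parser; `find_blocking_rules` is a hypothetical helper written for this example:

```python
# Rough sketch: flag robots.txt Disallow rules that are likely to block
# CSS, JS, or include files. A heuristic scan, not a full parser.
def find_blocking_rules(robots_txt: str) -> list:
    suspects = []
    for line in robots_txt.splitlines():
        line = line.split("#", 1)[0].strip()          # drop comments
        if not line.lower().startswith("disallow:"):
            continue
        path = line.split(":", 1)[1].strip()
        if any(ext in path for ext in (".js", ".css", ".inc")):
            suspects.append(line)
    return suspects

robots = """\
User-agent: *
Disallow: /wp-admin/
Disallow: /*.css$
Disallow: /*.js$
"""
print(find_blocking_rules(robots))  # ['Disallow: /*.css$', 'Disallow: /*.js$']
```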

Problem solved. To confirm the fix, re-test the affected URL with the robots.txt Tester and Fetch as Google tools in Google Search Console.

Have questions or need help? Please contact us.