Google Technical Webmaster Guideline Update & WordPress

Gyi Tsakalakis
October 28, 2014

Back in May, Google announced that they had upgraded their system to render web pages with CSS and JavaScript turned on. They recently updated their technical webmaster guidelines to reflect this change.

Interestingly, they also noted:

For optimal rendering and indexing, our new guideline specifies that you should allow Googlebot access to the JavaScript, CSS, and image files that your pages use. This provides you optimal rendering and indexing for your site. Disallowing crawling of Javascript or CSS files in your site’s robots.txt directly harms how well our algorithms render and index your content and can result in suboptimal rankings.

Whoa.

This is one of those rare instances where Google shares a direct negative ranking factor.

My guess is that many of you haven't taken any intentional action to block JavaScript or CSS files in your robots.txt. But I would also guess that many of you who use WordPress are blocking the /wp-includes/ and /wp-content/ sub-directories in your robots.txt file. It will look something like this:

User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /wp-content/

(The /wp-includes/ and /wp-content/ lines are the ones at issue here.)

This has been a very common and regularly recommended practice for preventing sensitive directories from being easily discovered by hackers.

Unfortunately, in light of Google's recent update, it seems that this might now be problematic. You see, JavaScript and CSS files typically reside in these sub-directories. So, blocking them from being crawled may result in suboptimal rankings in Google.
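For example, on a typical WordPress install, core scripts, theme styles, and plugin scripts live at paths like these (the theme and plugin names here are placeholders):

/wp-includes/js/jquery/jquery.js
/wp-content/themes/your-theme/style.css
/wp-content/plugins/your-plugin/script.js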

You'll have to decide for yourself how to resolve this issue. You may choose simply to allow Google to crawl these sub-directories, as shown in the sketch below. Alternatively, you could allow crawling of only the sub-directories that contain your JavaScript and CSS files. Or, you could move these files to a less sensitive area. However, if you do choose to move these files, you might cause some big broken-link headaches.
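Here's a sketch of the second option. Googlebot supports Allow directives and gives longer (more specific) rules precedence over shorter ones, so specific Allow lines for the asset directories can coexist with the broader Disallow lines. The exact paths are assumptions (your theme and plugins may keep assets elsewhere), so verify the result with the robots.txt Tester in Webmaster Tools:

User-agent: *
# More specific Allow rules take precedence over the shorter Disallow rules below
Allow: /wp-includes/js/
Allow: /wp-includes/css/
Allow: /wp-content/themes/
Allow: /wp-content/plugins/
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /wp-content/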

To see if you might have this problem, navigate to:

https://yourdomain.com/robots.txt

If you see:

Disallow: /wp-includes/
Disallow: /wp-content/

You're probably blocking Google from crawling your JavaScript and CSS files.
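If you don't need those directories hidden, the simplest fix is the first option above: delete the two Disallow lines so your robots.txt blocks only what you actually want blocked, for example:

User-agent: *
Disallow: /wp-admin/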

Once you've resolved the blocking problem, head over to Google Webmaster Tools and navigate to:

Crawl >> Fetch as Google

If you see "Googlebot couldn't get all resources for this page. Here's a list:" and that list contains JavaScript and CSS files, the issue remains.

Work through your fix until your JavaScript and CSS files no longer appear as "Blocked."

I'm skeptical that this will give you a huge rankings or traffic boost (but if it does, please let us know in the comments). However, in a competitive landscape, every little bit can help.

It's a good practice to regularly fetch and render your site(s) in Google Webmaster Tools. Be sure to review the code that's generated under the "Fetching" tab. This is how Google sees your pages. Make sure that the information that you are sending to Google is what you want Google to see.

Don't forget to use Fetch and Render for both Desktop and Mobile. As the web increasingly goes mobile, search engines will continue to evolve to understand whether your pages are optimized for mobile, which can also impact rankings.

Site updates and inadvertent configuration mistakes can have a serious impact on your sites' ability to appear in search results. The good news is that with regular maintenance, most of these issues are fairly easy to remedy.

Gyi Tsakalakis
Co-Founder of AttorneySync
