Allow Googlebot Access To JavaScript And CSS For Optimal Rankings

You may remember an announcement made by Google several months ago that its indexing system now renders web pages more like a typical web browser.

As a direct follow up to that change, Google has just announced an update to one of its technical Webmaster Guidelines. The update specifies that, for optimal rendering and indexing, you should allow Googlebot access to the JavaScript, CSS, and image files used by your page.

Google warns against using robots.txt to disallow the crawling of JavaScript or CSS files, saying that doing so directly harms how well Google’s algorithms render and index site content. This can result in “suboptimal rankings,” the company says.
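As a rough illustration, the kind of robots.txt rules Google is warning against look like the first block below, while the second shows the permissive alternative. The `/js/` and `/css/` directory names are assumptions for the example; your site's paths will differ.

```text
# Problematic: hides rendering resources from crawlers
User-agent: *
Disallow: /js/
Disallow: /css/

# Preferred: let Googlebot fetch the files needed to render the page
User-agent: *
Allow: /js/
Allow: /css/
```

If your robots.txt currently contains lines like the first block, removing them (or replacing them with explicit Allow rules) is the fix Google's updated guideline calls for.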

Further Advice For Optimal Indexing

Google goes on to provide a few additional pieces of advice, now that its indexing systems render web pages like modern web browsers. This advice specifically pertains to site speed and how it can be optimized for better indexing.

Pages that render quickly are also indexed more efficiently. Google lays out some best practices for page performance optimization:

  • Eliminate unnecessary downloads
  • Optimize the serving of your CSS and JavaScript files: concatenate (merge) separate CSS and JavaScript files, minify the concatenated files, and configure your web server to serve them compressed (usually with gzip compression)
  • Make sure your server can handle the additional load for serving of JavaScript and CSS files to Googlebot.
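The concatenate-and-compress steps from the list above can be sketched as a few shell commands. This is a minimal, illustrative build step, not Google's recommended tooling: the filenames are made up, and in practice you would run a real minifier (any CSS/JS minification tool in your build pipeline) between the concatenation and compression steps.

```shell
# Set up sample stylesheets (illustrative content only)
mkdir -p css dist
printf 'body { margin: 0; }\n' > css/reset.css
printf 'h1 { color: #333; }\n' > css/theme.css

# Concatenate (merge) the separate stylesheets into a single file,
# reducing the number of downloads the browser (and Googlebot) must make
cat css/reset.css css/theme.css > dist/site.css

# (A minification step would normally run here, on dist/site.css)

# Pre-compress with gzip; -k keeps the uncompressed original so the
# server can fall back to it for clients that do not accept gzip
gzip -kf dist/site.css   # produces dist/site.css.gz
```

Your web server would then be configured to serve `dist/site.css.gz` with a `Content-Encoding: gzip` header when the client advertises gzip support.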

To see how Googlebot renders your page, and to troubleshoot any indexing issues, the Fetch and Render as Google feature in Webmaster Tools has recently been updated.

By Matt Southern, SEO Tips.

