Updates to Google Webmaster Guidelines Warn Users about Blocking JavaScript and CSS Files

Author: Allison Kemmer

Anyone who has had a website up and active for any length of time knows that Google crawls sites to size up their information and assign an appropriate ranking. Historically, the HTML code is what those determinations were based on; more complicated assets such as images, CSS, and JavaScript escaped scrutiny. That has all changed with the current updates to the Google Webmaster Guidelines. Google now advises that you allow those files to be crawled.

The Importance of Ranking

Where your site ranks in search engine results can make or break your business. If no one knows you are out there, no one will visit your site. It is important to follow the Google Webmaster Guidelines to give your site and business every chance to be successful, and embracing this change will be critical to your longevity on the internet.

How the Updates Help Google Determine Ranking

In an effort to offer the best results to everyone who uses its search engine, Google wants a more comprehensive look at what your website offers. That goes beyond quality content. The written words are important, but Google also wants to view all of your images, CSS, and JavaScript files to get the full picture of what you are offering visitors.

The Technicalities of the Update

If you are unsure which files might be hidden or blocked from Googlebot, Google offers some great advice in its support section. Simply test your site using the Fetch as Google tool (with the Fetch and Render option) and the robots.txt Tester, both found in Webmaster Tools. These will show you a visual representation of what Googlebot actually sees.

Allow open communication between your website and Googlebot by using the robots.txt file on your web server. It tells the crawler which directories can be crawled and which cannot. Keep it updated so that you do not inadvertently block the bot.
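For illustration, a minimal robots.txt along these lines would leave CSS, JavaScript, and image directories open to Googlebot while blocking only areas with no search value. The directory names here are hypothetical; adjust them to match your own site's layout:

```
# Applies to all crawlers, including Googlebot
User-agent: *

# Hypothetical asset directories -- make sure these are NOT disallowed
Allow: /css/
Allow: /js/
Allow: /images/

# Block only directories that offer no value to search engines
Disallow: /admin/
```

A common mistake is a blanket rule such as `Disallow: /js/` left over from older advice; the robots.txt Tester in Webmaster Tools will flag any resource that a rule like that blocks.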

The Ultimate Penalty

Google makes it very clear in its warnings what you can expect if you do not allow complete access to your entire site, including CSS and JavaScript files. Pierre Far, Google's Webmaster Trends Analyst, stated that the "new guideline specifies that you should allow Googlebot access to the JavaScript, CSS, and image files that your pages use," all for "optimal rendering and indexing." If you intentionally or accidentally block Googlebot's access to these files, he added, it "directly harms how well our [Google] algorithms render and index your content and can result in suboptimal rankings." Quality control and constant improvement are the hallmarks of any company that wants to grow, and you can easily use these guideline changes to your advantage.