Google Webmaster Tools adds “Fetch and Render” feature for detecting JavaScript errors

[Screenshot: Google Webmaster Tools, 2014]

A few days back, I read somewhere that Google Webmaster Tools was going to launch a feature to detect content in JavaScript. In previous years, Google’s announcements had simply advised webmasters not to block JavaScript.

But now they have officially launched a new feature in Google Webmaster Tools, named “Fetch and Render”, that reads content generated by JavaScript. Here is an official statement from a member of the Google team:

To make things easier to debug, we’re currently working on a tool for helping webmasters better understand how Google renders their site. We look forward to making it available for you in the coming days in Webmaster Tools.

Just log in to your Webmaster Tools account > click on your website name > go to Crawl > then click on Fetch as Google.

There you will find the Fetch and Render button; if you visit your Webmaster Tools account frequently, you will spot it easily.

[Screenshot: the Fetch as Google “Fetch and Render” feature, 2014]

Click on the Partial status (it shows Partial because we have already blocked some parts of our website from being crawled by search engine spiders). After clicking on it, you will see which JavaScript and other resources on the current page were blocked, preventing crawlers from reading that part of your content.

[Screenshot: blocked JavaScript resources shown in Google Webmaster Tools]

Googlebot follows the robots.txt directives for all files that it fetches. If you are disallowing crawling of some of these files (or if they are embedded from a third-party server that’s disallowing Googlebot’s crawling of them), we won’t be able to show them to you in the rendered view. Similarly, if the server fails to respond or returns errors, then we won’t be able to use those either (you can find similar issues in the Crawl Errors section of Webmaster Tools). If we run across either of these issues, we’ll show them below the preview image.

It is now advisable to unblock your CSS and JavaScript so that search engine spiders can read and index your site content more efficiently.
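For example, if your robots.txt has been blocking the directories that hold your scripts and stylesheets, you can open them up for Googlebot with explicit Allow rules. This is only a minimal sketch, and the /js/ and /css/ paths are placeholders for wherever your site actually keeps those files:

    User-agent: Googlebot
    # Let Googlebot fetch scripts and stylesheets so that Fetch and Render
    # can show the page the way a real visitor sees it
    Allow: /js/
    Allow: /css/
    Allow: /*.js$
    Allow: /*.css$

Googlebot understands the * and $ wildcards in robots.txt, so the last two rules also cover scripts and stylesheets stored elsewhere on the site.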


Here is the complete list of common debugging issues:

  • If resources like JavaScript or CSS in separate files are blocked (say, with robots.txt) so that Googlebot can’t retrieve them, Google’s indexing systems won’t be able to see your site like an average user. Google recommends allowing Googlebot to retrieve JavaScript and CSS so that your content can be indexed better. This is especially important for mobile websites, where external resources like CSS and JavaScript help Google’s algorithms understand that the pages are optimized for mobile.
  • If your web server is unable to handle the volume of crawl requests for resources, it may have a negative impact on Google’s capability to render your pages. If you’d like to ensure that your pages can be rendered by Google, make sure your servers are able to handle crawl requests for resources.
  • It’s always a good idea to have your site degrade gracefully (see the sketch after this list). This will help users enjoy your content even if their browser doesn’t have compatible JavaScript implementations. It will also help visitors with JavaScript disabled or turned off, as well as search engines that can’t execute JavaScript yet.
  • Sometimes the JavaScript may be too complex or arcane for us to execute, in which case Google can’t render the page fully and accurately.
  • Some JavaScript removes content from the page rather than adding it, which prevents Google from indexing the content.
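To illustrate graceful degradation, here is a minimal sketch (the file structure and element names are my own illustration, not from Google’s post). The article text lives in the static HTML, so crawlers and visitors without JavaScript still get the content, while JavaScript only adds an enhancement on top:

    <!DOCTYPE html>
    <html>
    <head>
      <title>Example article</title>
    </head>
    <body>
      <article id="content">
        <h1>Article headline</h1>
        <p>The full article text sits here in plain HTML, so it stays
           visible even when JavaScript is disabled or a crawler cannot
           execute scripts.</p>
      </article>
      <script>
        // Enhancement only: build a "back to top" link when JavaScript runs.
        // Removing this script loses a convenience, not the content.
        var link = document.createElement('a');
        link.href = '#content';
        link.textContent = 'Back to top';
        document.body.appendChild(link);
      </script>
      <noscript>
        <p>Some interactive extras are unavailable without JavaScript.</p>
      </noscript>
    </body>
    </html>

The opposite pattern, where the page loads empty and JavaScript injects (or strips out) the content, is exactly what the last two issues in the list warn about.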

 

At the end of their official post on the Webmaster Central Blog, they stated:

Some types of content – such as social media buttons, fonts or website-analytics scripts – tend not to meaningfully contribute to the visible content or layout, and can be left disallowed from crawling.
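If you host any such scripts on your own server, you could keep just those paths disallowed while your main assets stay crawlable. A hypothetical sketch (both paths are illustrative):

    User-agent: Googlebot
    Allow: /js/
    # Analytics and share-button scripts don't change the visible layout,
    # so, per Google's note above, they can stay blocked
    Disallow: /js/analytics/
    Disallow: /js/share-buttons/

Because Googlebot applies the most specific (longest) matching rule, the two Disallow lines win over the broader Allow for those subdirectories.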

 


If you like this post, please leave your feedback as a comment below.

 
