Google Webmaster Tool adds “Fetch and Render” feature for detecting JavaScript Errors

Disclosure: When you purchase a service or a product through our links, we sometimes earn a commission.


A few days back, I read somewhere that Google Webmaster Tools was going to launch a feature to detect content rendered by JavaScript. In previous years, Google's advice was simply to block JavaScript for a while.

But now they have officially launched a new feature in Google Webmaster Tools, named “Fetch and Render”, that reads content rendered by JavaScript. Here is an official statement from a member of the Google team:

To make things easier to debug, we’re currently working on a tool for helping webmasters better understand how Google renders their site. We look forward to making it available for you in the coming days in Webmaster Tools.

Just log in to your Webmaster Tools account > click on your website name > go to Crawl > then click on Fetch as Googlebot.

There you will find the Fetch and Render button; if you visit your Webmaster Tools account frequently, you will spot it easily.



Click on the Partial status (it shows Partial because we have already blocked some parts of our website from being crawled by search engine spiders). After clicking it, you will see which sections of the current page have JavaScript blocked, preventing crawlers from reading your content.


Googlebot follows the robots.txt directives for all files that it fetches. If you are disallowing crawling of some of these files (or if they are embedded from a third-party server that’s disallowing Googlebot’s crawling of them), we won’t be able to show them to you in the rendered view. Similarly, if the server fails to respond or returns errors, then we won’t be able to use those either (you can find similar issues in the Crawl Errors section of Webmaster Tools). If we run across either of these issues, we’ll show them below the preview image.

It is now advisable to unblock your CSS and JavaScript so that search engine spiders can read your content and index your site more efficiently.
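As a rough sketch, unblocking these assets usually comes down to a couple of lines in your robots.txt (the rules below are illustrative; adjust them to your own file structure):

```
User-agent: Googlebot
# Let Googlebot fetch script and stylesheet assets
Allow: /*.js$
Allow: /*.css$
```

If your robots.txt already disallows a directory such as a theme or includes folder, a more specific Allow rule for the .js and .css files inside it generally takes precedence for Googlebot.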

Here is a complete list of common debugging issues:

  • If resources like JavaScript or CSS in separate files are blocked (say, with robots.txt) so that Googlebot can’t retrieve them, Google’s indexing systems won’t be able to see your site like an average user. Google recommends allowing Googlebot to retrieve JavaScript and CSS so that your content can be indexed better. This is especially important for mobile websites, where external resources like CSS and JavaScript help Google’s algorithms understand that the pages are optimized for mobile.
  • If your web server is unable to handle the volume of crawl requests for resources, it may have a negative impact on Google’s capability to render your pages. If you’d like to ensure that your pages can be rendered by Google, make sure your servers are able to handle crawl requests for resources.
  • It’s always a good idea to have your site degrade gracefully. This will help users enjoy your content even if their browser doesn’t have compatible JavaScript implementations. It will also help visitors with JavaScript disabled or off, as well as search engines that can’t execute JavaScript yet.
  • Sometimes the JavaScript may be too complex or arcane for us to execute, in which case Google can’t render the page fully and accurately.
  • Some JavaScript removes content from the page rather than adding it, which prevents Google from indexing that content.


At the end of their official post on the Webmaster Central blog, they stated that:

Some types of content – such as social media buttons, fonts or website-analytics scripts – tend not to meaningfully contribute to the visible content or layout, and can be left disallowed from crawling.
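In other words, non-essential scripts can stay blocked. A hypothetical example (the path here is an assumption, not a real directory on any particular site) would be:

```
User-agent: Googlebot
# Analytics and social-widget scripts that don't affect visible content
# can remain disallowed
Disallow: /assets/analytics/
```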


If you like this post, please leave your feedback as a comment below.



About the author


Rahul Setia works as an SEO Analyst for NeotericUK, based in London, UK. As an SEO, he enjoys building marketing strategies for clients, delving into website data analysis, and writing content. Outside of SEO, he enjoys playing cricket and listening to music, although not at the same time.
