To make things easier to debug, we’re currently working on a tool for helping webmasters better understand how Google renders their site. We look forward to making it available to you in the coming days in Webmaster Tools.
Just log in to your Webmaster Tools account > click on your website name > go to Crawl > then click on Fetch as Googlebot.
There you will find the Fetch and Render button; if you visit your Webmaster Tools account frequently, you will spot it easily.
Googlebot follows the robots.txt directives for all files that it fetches. If you are disallowing crawling of some of these files (or if they are embedded from a third-party server that’s disallowing Googlebot’s crawling of them), we won’t be able to show them to you in the rendered view. Similarly, if the server fails to respond or returns errors, then we won’t be able to use those either (you can find similar issues in the Crawl Errors section of Webmaster Tools). If we run across either of these issues, we’ll show them below the preview image.
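If you want a quick sanity check before running a fetch, a small script can tell you whether a resource URL is blocked for Googlebot by your robots.txt. Here is a minimal sketch using Python’s standard-library urllib.robotparser; the example.com domain and resource URLs are placeholders, so swap in your own site’s files.

```python
from urllib.robotparser import RobotFileParser

# Placeholder URLs -- replace with your own site and resources.
robots_url = "https://www.example.com/robots.txt"
resources = [
    "https://www.example.com/css/main.css",
    "https://www.example.com/js/app.js",
    "https://www.example.com/img/logo.png",
]

parser = RobotFileParser(robots_url)
parser.read()  # fetch and parse the live robots.txt

for url in resources:
    if parser.can_fetch("Googlebot", url):
        print(f"{url}: crawlable, so Googlebot can use it when rendering")
    else:
        print(f"{url}: blocked by robots.txt, so it will be missing from the rendered view")
```

Any resource reported as blocked here is a likely candidate for the issues listed below the preview image.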
Google also pointed out some common debugging issues:
- If your web server is unable to handle the volume of crawl requests for resources, it may have a negative impact on Google’s capability to render your pages. If you’d like to ensure that your pages can be rendered by Google, make sure your servers are able to handle crawl requests for resources.
At the end of the official post on the Webmaster Central blog, Google stated:
Some types of content – such as social media buttons, fonts or website-analytics scripts – tend not to meaningfully contribute to the visible content or layout, and can be left disallowed from crawling.
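In practice, that advice could translate into a robots.txt along these lines. This is only an illustrative sketch: the /tracking/ and /social-widgets/ paths are made up, so map the rules to wherever your own site actually keeps those files.

```
User-agent: *
# Non-essential scripts that do not affect visible content or layout
Disallow: /tracking/
Disallow: /social-widgets/
# CSS and JavaScript needed for rendering are deliberately left crawlable,
# so there are no Disallow rules for /css/ or /js/
```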
If you like this post, please leave your feedback as a comment below.