
I was reading the news on TechCrunch about a new Google Analytics feature called “Bot and Spider Filtering”. This feature helps you track the real number of visitors to your site (that is, human visits). Although it is a small update from Google Analytics, it can have a real impact on your visitor analytics.

The reason they introduced this feature is that traffic from search engine crawlers and other web spiders can easily skew your data in Google Analytics.

Note: Google is only filtering out traffic from known bots and spiders. It uses the IAB’s “International Spiders & Bots List” for this, which is updated monthly.

You simply select a new checkbox, available at the view level of the management user interface, labeled “Exclude traffic from known bots and spiders”. Selecting this option excludes all hits that come from bots and spiders on the IAB’s known bots and spiders list. The backend excludes hits matching the User Agents named in the list, as though they were subject to a profile filter. This lets you identify the real number of visitors coming to your site.
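To make the idea concrete, here is a minimal sketch of how User-Agent-based bot exclusion works in principle. This is not Google’s actual implementation: the bot patterns and the hit structure below are illustrative assumptions, and the real IAB list is far larger and only available to license holders.

```python
import re

# Illustrative subset of bot/spider User-Agent patterns.
# The real IAB "International Spiders & Bots List" is much larger;
# these entries are assumptions for the sketch.
KNOWN_BOT_PATTERNS = [
    re.compile(r"Googlebot", re.IGNORECASE),
    re.compile(r"bingbot", re.IGNORECASE),
    re.compile(r"AhrefsBot", re.IGNORECASE),
]

def is_known_bot(user_agent: str) -> bool:
    """Return True if the User-Agent matches a known bot/spider pattern."""
    return any(pattern.search(user_agent) for pattern in KNOWN_BOT_PATTERNS)

def filter_hits(hits):
    """Keep only hits that do not come from known bots or spiders,
    similar in spirit to applying a profile filter on User-Agent."""
    return [hit for hit in hits if not is_known_bot(hit.get("user_agent", ""))]

# Example: two human visits and one crawler hit.
hits = [
    {"page": "/home", "user_agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"},
    {"page": "/home", "user_agent": "Mozilla/5.0 (compatible; Googlebot/2.1)"},
    {"page": "/pricing", "user_agent": "Mozilla/5.0 (Macintosh; Intel Mac OS X)"},
]
print(len(filter_hits(hits)))  # -> 2 human hits remain
```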

[Screenshot: the Bot Filtering option in Google Analytics view settings]

Once you have opted in to excluding this kind of traffic, Analytics will automatically start filtering your data by comparing the hits to your site against the known User Agents on the list.
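If you manage many views, the same setting can also be toggled programmatically. Below is a sketch using the Google Analytics Management API (v3), which exposes a botFilteringEnabled flag on the view (profile) resource; the account, property, and view IDs and the credentials file are placeholders you would replace with your own.

```python
from googleapiclient.discovery import build
from google.oauth2 import service_account

# Assumed placeholders: swap in your own service account key and IDs.
SCOPES = ["https://www.googleapis.com/auth/analytics.edit"]
credentials = service_account.Credentials.from_service_account_file(
    "service-account.json", scopes=SCOPES)

analytics = build("analytics", "v3", credentials=credentials)

# Enable "Exclude traffic from known bots and spiders" on one view (profile).
analytics.management().profiles().patch(
    accountId="12345678",           # your Analytics account ID
    webPropertyId="UA-12345678-1",  # your property ID
    profileId="98765432",           # the view (profile) ID
    body={"botFilteringEnabled": True},
).execute()
```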

About the author

Rahul Setia

Rahul Setia works as Digital Marketing Head for Contentmart.com. As a Digital Head, he enjoys building marketing strategies, delving into website data analysis, and writing content. He is on Twitter at @rahulsetia007 and on Facebook.
