I was reading a story on TechCrunch about a new Google Analytics feature called "Bot and Spider Filtering". This feature helps you track the actual number of real (human) visitors. Although it is a small update from Google Analytics, it can have a noticeable impact on your website's visitor analytics.
Google introduced this feature because traffic from search engine crawlers and other web spiders can easily skew your data in Google Analytics.
Note: Google is only filtering out traffic from known bots and spiders. It uses the IAB's "International Spiders & Bots List" for this, which is updated monthly.
You can simply select a new checkbox option available at the view level of the admin interface, labeled "Exclude traffic from known bots and spiders". Selecting this option will exclude all hits that come from bots and spiders on the IAB's known bots and spiders list. The backend will exclude hits matching the User Agents named in the list, as though they were subject to a profile filter. This lets you identify the real number of visitors coming to your site.
Once you have opted in to excluding this kind of traffic, Analytics will automatically start filtering your data by comparing the User Agent of each hit to your site against the known User Agents on the list.
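To make the idea concrete, here is a minimal sketch of User-Agent-based filtering in Python. The bot patterns and hit records below are made-up examples for illustration only; the real IAB "International Spiders & Bots List" is a licensed, monthly-updated file, and Google applies this filtering on its own backend, not in your code.

```python
# Hypothetical bot patterns -- the actual IAB list is far more extensive.
KNOWN_BOT_SUBSTRINGS = [
    "googlebot",
    "bingbot",
    "crawler",
    "spider",
]

def is_known_bot(user_agent: str) -> bool:
    """Return True if the User-Agent string matches a known bot pattern."""
    ua = user_agent.lower()
    return any(pattern in ua for pattern in KNOWN_BOT_SUBSTRINGS)

def filter_hits(hits):
    """Keep only hits whose User-Agent does not match the bot list."""
    return [hit for hit in hits if not is_known_bot(hit["user_agent"])]

# Example hits: one from a browser, one from a crawler.
hits = [
    {"page": "/home", "user_agent": "Mozilla/5.0 (Windows NT 10.0) Chrome/38.0"},
    {"page": "/home", "user_agent": "Mozilla/5.0 (compatible; Googlebot/2.1)"},
]
human_hits = filter_hits(hits)
```

The key design point is the same as in Analytics: filtering happens per hit, by matching the User-Agent against a maintained list, so your reports only count the hits that survive the filter.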