Convertmind knowledge base
To understand what bot and spider filtering in Google Analytics does, we first need to establish what bots and spiders are. Bots are automated programs that visit your website. Examples include search engines indexing your site, web archives making a copy of it, a program checking for new content to show a user in their feed, and possibly services you’ve hired yourself to monitor the status of your site. A spider is a specific type of bot designed to ‘crawl’ the web to gather information. Search engine indexing, for example, is done by spiders.
Some of these bots can cause problems with your Analytics data. Some bots will trigger the Google Analytics code when they visit your website, which will cause a hit to be registered. These bots are sometimes known as smart bots.
These bots may register as hits in Google Analytics, but when it comes to your sales they are of zero value; they’re not real people, after all. This means your data will contain false information, which skews your results. This is particularly problematic if a lot of bots visit your site at once, or one bot visits your site many times in a short timespan. That can cause a massive spike in your data, which you would then need to account for, filter out and compensate for every time you use any data from that point onwards.
This is where bot and spider filtering comes in. Bot and spider filtering is a feature in Analytics that will automatically filter bots and spiders from your data. When it is enabled, Google Analytics will detect bots coming to your site and take them out of the results for you. It does this using a database called the IAB/ABC International Spiders & Bots List. This is a large database containing most known bots, and it is continually updated.
If you enable bot and spider filtering, Google Analytics will check all data it gets with this database. If it detects traffic that matches an entry in the database, it will know that’s a bot and take it out of your results. A few bots may still slip through, but the vast majority of bots will be caught this way. This way, you can be sure your data is an accurate representation of your actual visitors.
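Conceptually, this check works like matching each hit’s user-agent string against a list of known bot signatures. The sketch below illustrates the idea in Python; the real IAB/ABC list is licensed and far larger, so the signature list here is a tiny illustrative stand-in, not the actual list Google uses.

```python
# Conceptual sketch of bot filtering: drop any hit whose user-agent
# matches a known bot signature. KNOWN_BOT_SIGNATURES is a tiny
# illustrative stand-in for the (proprietary) IAB/ABC list.

KNOWN_BOT_SIGNATURES = ["googlebot", "bingbot", "ahrefsbot", "semrushbot"]

def is_known_bot(user_agent: str) -> bool:
    """Return True if the user-agent matches a known bot signature."""
    ua = user_agent.lower()
    return any(sig in ua for sig in KNOWN_BOT_SIGNATURES)

hits = [
    {"user_agent": "Mozilla/5.0 (Windows NT 10.0) Chrome/120.0", "page": "/home"},
    {"user_agent": "Mozilla/5.0 (compatible; Googlebot/2.1)", "page": "/home"},
]

# Keep only hits that do not look like known bots.
filtered = [h for h in hits if not is_known_bot(h["user_agent"])]
print(len(filtered))  # 1 — the Googlebot hit is dropped
```

As the article notes, a list like this can never be complete, which is why a few new or obscure bots may still slip through.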
Bot and spider filtering can be enabled in the view settings under the admin settings.
Before you enable bot filtering in your main view, it’s recommended you first try it out in an extra view. As with any changes you make to your view settings, it’s important to have a backup view without any filters. This way, should something somehow go wrong with the filters, you still have a backup of your data you can use.
Bot and spider filtering can be enabled by following these steps:
1. Sign in to Google Analytics and open the Admin panel.
2. In the View column, click View Settings.
3. Under Bot Filtering, tick the checkbox ‘Exclude all hits from known bots and spiders’.
4. Click Save.
Bot and spider filtering is now enabled. Keep in mind: this only affects your data from this point on. Earlier data will not be affected.
Google Analytics is able to ignore bots. However, it won’t do this automatically. You need to enable this.
By default, Google Analytics will not ignore bots. This means that if a bot visits your site and it triggers the Google Analytics code, it will be registered as a session in your Google Analytics data. This could skew your data, as these bots do not represent your actual human visitors.
Google Analytics does have a built-in feature that will automatically filter any traffic it can detect as coming from bots. If you want to enable this filtering, you can read all about it earlier in this article.
It should be mentioned though that this feature isn’t 100% foolproof. In order to find out which traffic comes from bots, Google Analytics uses a large database of known bots. This database is continually being updated, but it’s never perfect. A few new or smaller bots could still slip through. Generally speaking these bots will be negligible, but keep this in mind when you do see an odd spike in your traffic. You can still filter this manually.
Generally speaking, it’s almost never good to have bots in your Google Analytics data. Unless there are specific types of bots you want to be included for a specific reason, any traffic from bots will cause your results to be at least somewhat skewed. After all, these bots are not real humans, and will never convert on your site unless they’re programmed to. So while bots can be very useful for a variety of purposes, within the context of your Google Analytics data they are generally all bad.
Bots may or may not trigger a session in Google Analytics. This depends on the type of bot and how it interacts with your site. Bots that don’t trigger Google Analytics are completely harmless for your data. However, unless you have insight into the coding of the bot you want to know about, there unfortunately isn’t a way to check whether a bot could impact your data or not.
Some bots can do more damage than others. If one visits your site many times in a short timespan, for example, that can cause a spike in your traffic which heavily skews your data. Unfortunately, if you’re filtering manually, there is little to no way to know about this ahead of time. You will have to spot it manually in Google Analytics.
It’s possible that, with or without the automatic bot filter enabled, some bots may slip through the cracks and pop up in your Google Analytics data. If this happens, you will have to find them and filter them manually.
Google Analytics unfortunately doesn’t have a button that shows you all bot traffic at once. You will have to identify it manually. Generally speaking, you do this by looking for oddities in your data: a segment that displays particularly odd behaviour. Common places to look include the service provider, screen resolution and, for sites targeting a single country, geographical location.
Look for a segment with highly unrealistic results. To give an example, we run a Dutch website that saw a lot of traffic from India. Each of those visits lasted no longer than two seconds, viewed only one page and had a bounce rate of 100%. This was clearly bot traffic. Tip: bots often visit only one page of a site for a second or two, resulting in a 100% bounce rate. This isn’t a given though; bot behaviour can vary wildly depending on the kind of bot you’re dealing with.
Once you’ve found a segment like this, try to isolate it as much as possible. When compensating for these results, you definitely don’t want to filter out any actual human visitors along with it. So try to narrow it down as much as possible to the most specific demographic you can get, so you can be 100% sure you’re not filtering out any actual visitors.
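The hunt described above can be sketched as a small script: group sessions by a dimension (here, country) and flag groups where nearly every session looks bot-like (single page, very short duration). The field names and thresholds are assumptions for illustration; adapt them to however you export your Analytics data.

```python
# Hedged sketch: flag segments whose sessions look bot-like
# (one page view, duration of a couple of seconds). The session
# records and thresholds below are hypothetical examples.

from collections import defaultdict

sessions = [
    {"country": "NL", "pages": 4, "duration_s": 95},
    {"country": "NL", "pages": 2, "duration_s": 40},
    {"country": "IN", "pages": 1, "duration_s": 1},
    {"country": "IN", "pages": 1, "duration_s": 2},
    {"country": "IN", "pages": 1, "duration_s": 1},
]

def looks_bot_like(s: dict) -> bool:
    """Single-page visit lasting a second or two: the pattern from the tip above."""
    return s["pages"] == 1 and s["duration_s"] <= 2

# Group sessions by country (any dimension would work: provider, resolution, ...).
by_country = defaultdict(list)
for s in sessions:
    by_country[s["country"]].append(s)

# Flag groups where at least 90% of a meaningful number of sessions look bot-like.
for country, group in by_country.items():
    share = sum(looks_bot_like(s) for s in group) / len(group)
    if share >= 0.9 and len(group) >= 3:
        print(f"Suspicious segment: {country} ({share:.0%} bot-like sessions)")
```

Running this prints `Suspicious segment: IN (100% bot-like sessions)`, mirroring the Dutch-site example: the Indian segment is isolated precisely because almost all of its sessions fit the bot pattern, while the Dutch sessions are left alone.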
Once you’ve found a bot in your data, you can then manually filter it from your results using the specific information you found out about it. In order to do this, you can create a filter within your view. To learn more about how to do this, we recommend the dedicated article about filtering in GA.