Bot traffic describes any non-human traffic that visits a website. Whether the website is a hugely popular news site or a small, newly published startup, the site is bound to be visited by a certain number of bots over time.
While the term 'bot traffic' is often misconstrued as inherently harmful, this is not always the case. There is no doubt that some bot traffic is designed to be malicious and can negatively affect analytics data. These malicious bots can be used for credential stuffing, data scraping, and in some cases, even launching distributed denial of service (DDoS) attacks.
However, web robots are also essential for the operation of specific web services such as search engines and digital assistants. Therefore, digital publishers need to use their analytics data to discern between human behaviour and the good, the bad, and the ugly of bot traffic.
As mentioned, certain bots are required for the operation and optimal performance of search engines and digital assistants. However, other bots are explicitly designed to damage sites and degrade the user experience.
The types of bots to watch out for include:
- Click Bots
Click bots are used for click spamming by making fraudulent ad clicks. For most web publishers, particularly those running Pay Per Click (PPC) ads, this is considered the most damaging type of bot, because click bots skew analytics data by mimicking genuine traffic and erode the ad budget with no benefit to the publisher.
- Download Bots
Similar to click bots, download bots also interfere with genuine user engagement data. However, rather than affecting the ad click count, they create a fake download count. This is most pertinent when a publisher uses a marketing funnel, for example, a free ebook download. Download bots create a phoney download, leading to false performance data.
- Spam Bots
Spam bots are the most common type of bot. The purpose of a spam bot is often to scrape contact information, including email addresses and phone numbers, create fake user accounts, or operate stolen social media accounts. They also disrupt user engagement by distributing unwarranted content, such as:
- Spam comments, including referral spam
- Phishing emails
- Ads
- Website redirects
- Negative SEO against competitors
- Spy Bots
Spy bots are so named because they behave exactly like spies: they steal data and information, such as email addresses, from websites, chat rooms, social media sites, and forums.
- Scraper Bots
Scraper bots visit websites with the sole malicious intent of stealing publishers' content, and they can pose a real threat to a business and its web pages. Created by third-party scrapers, they are often employed by business competitors to steal valuable content, such as product lists and prices, which is then repurposed and published on competitor sites.
- Imposter Bots
Imposter bots replicate human behaviour by appearing as genuine website visitors. They are designed to bypass online security measures, and they are the bots most often responsible for DDoS activity.
While the above examples are undoubtedly cases of harmful bot traffic, what are some instances of good bot traffic? The following bots are legitimate and are there to provide helpful solutions for websites and applications.
- Search Engine Bots
Search engine bots are the most obvious and well-known of the 'good' bots. Search engine bots crawl the web and help website owners get their websites listed in search results on Google, Yahoo, and Bing. These bots are helpful SEO tools.
- Monitoring Bots
Monitoring bots help publishers ensure their website is healthy and accessible while operating at peak performance. Monitoring bots operate by automatically pinging the site to ensure it is still online. If anything breaks or the site goes offline, the publisher will be automatically notified, making these bots very useful to site owners.
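At their core, many monitoring bots are simple uptime checkers. The TypeScript sketch below shows the idea; the URL, the five-minute interval, and the console-based alerting are placeholder assumptions rather than how any particular monitoring service works.

```typescript
// Minimal uptime check: request the page and raise an alert on failure.
// In a real monitoring bot the alert would go to email, SMS, or a dashboard.
async function checkSite(url: string): Promise<void> {
  try {
    const res = await fetch(url, { method: "HEAD" });
    if (!res.ok) {
      console.error(`ALERT: ${url} responded with HTTP ${res.status}`);
    }
  } catch (err) {
    console.error(`ALERT: ${url} is unreachable`, err);
  }
}

// Ping the site every five minutes (interval is illustrative).
setInterval(() => void checkSite("https://example.com"), 5 * 60 * 1000);
```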
- SEO Crawlers
SEO crawlers are software that crawls a website and its competitors to provide data and analytics on page views, users, and content. Web admins can then use these reports to plan their content to improve their referral traffic, search visibility, and organic traffic.
- Copyright Bots
Copyright bots crawl the internet, scanning for copyrighted images to ensure no one is illegally using copyrighted content without permission.
Bot traffic can effectively destroy businesses that don't learn how to identify and manage it. Sites that rely on advertising, as well as sites that sell products or merchandise with limited inventory, are particularly vulnerable.
For sites that are running ads, bots that land on the site and click on various page elements can trigger fake ad clicks. This is known as click fraud, and while it may initially increase ad revenue, once online advertising networks detect the fraud, it will usually result in the site and the owner getting banned from their network.
For eCommerce sites with limited inventory, inventory hoarding bots can virtually shut down their shop by filling carts with tons of merchandise, making it unavailable for purchase by genuine shoppers.
As we move into an increasingly tech-driven future, bots are getting smarter by the day. A report released by Imperva in 2020 found that bots made up almost 40% of all Internet traffic, with bad bots the most significant offenders.
Web publishers and designers can identify bot traffic by examining the network requests made to their sites. An integrated analytics tool such as Google Analytics will further help website owners spot bot traffic in their website data. The hallmarks of bot traffic include the following characteristics (a detection sketch follows the list):
- Abnormally High Pageviews
- Abnormally High Bounce Rate
- Junk Conversions
- Spike in Traffic From an Unexpected Location
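As an illustration, the TypeScript sketch below scores sessions against these hallmarks. The `Session` shape, the thresholds, and the two-signal cut-off are all illustrative assumptions, not values taken from any analytics product.

```typescript
// Hypothetical session record, loosely modelled on an analytics export.
interface Session {
  userAgent: string;
  pageviews: number;
  bounced: boolean;
  durationSeconds: number;
  country: string;
}

// Count how many bot hallmarks a session exhibits.
// All thresholds here are illustrative, not industry standards.
function botSignals(s: Session, expectedCountries: Set<string>): number {
  let signals = 0;
  if (s.pageviews > 100) signals++;                               // abnormally high pageviews
  if (s.bounced && s.durationSeconds < 1) signals++;              // instant bounce
  if (/bot|crawler|spider|scraper/i.test(s.userAgent)) signals++; // self-declared bot
  if (!expectedCountries.has(s.country)) signals++;               // unexpected location
  return signals;
}

// Example: treat two or more signals as likely bot traffic.
const isSuspicious = (s: Session) => botSignals(s, new Set(["SG", "US"])) >= 2;
```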
6. How to Stop Bot Traffic
Once a company or agency has learned how to identify bot traffic, it is imperative that they gain the knowledge and tools needed to stop bot traffic from negatively affecting their site.
The following tools will help minimize threats:
- Legitimate Traffic Arbitrage
Traffic arbitrage is the practice of paying to bring traffic to a website in order to profit from high-yielding PPC/CPM-based campaigns. By only purchasing traffic from known, reputable sources, site owners can reduce the risk of bad bot traffic.
- Use Robots.txt
Placing a robots.txt file at the root of a site tells compliant bots which pages they may crawl, which helps keep unwanted crawlers away; a sketch follows. Note, however, that malicious bots often ignore these rules.
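For example, a minimal robots.txt might look like the following; the paths and the scraper's user-agent name are purely hypothetical.

```
# Allow all bots by default, but keep them out of /private/.
User-agent: *
Disallow: /private/

# Ask one (hypothetical) scraper to stay away entirely.
# Only well-behaved bots honour these rules.
User-agent: BadScraperBot
Disallow: /
```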
- JavaScript for Alerts
Site owners can place a small piece of contextual JavaScript (JS) on their pages to alert them whenever a visitor shows signs of being a bot, as in the sketch below.
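A minimal browser-side sketch might check a few common automation signals and report them; the `/bot-alert` endpoint and the particular signals chosen are assumptions for illustration only.

```typescript
// Runs in the browser: check common automation signals and report them.
function reportIfBot(): void {
  const signals: string[] = [];

  // navigator.webdriver is true under most automation frameworks
  // (e.g. Selenium, Puppeteer).
  if (navigator.webdriver) signals.push("webdriver");

  // Headless or scripted clients often expose no plugins or languages.
  if (navigator.plugins.length === 0) signals.push("no-plugins");
  if (!navigator.languages || navigator.languages.length === 0) {
    signals.push("no-languages");
  }

  if (signals.length > 0) {
    // sendBeacon posts without blocking navigation; the endpoint is hypothetical.
    navigator.sendBeacon("/bot-alert", JSON.stringify({ signals }));
  }
}

reportIfBot();
```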
- DDoS Lists
Publishers can compile a list of offending IP addresses and deny requests from those addresses, reducing their exposure to DDoS attacks; see the middleware sketch below.
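As one way to apply such a list, this sketch uses Express (an assumption; any server framework works) to reject requests from blocked addresses before they reach the rest of the site.

```typescript
import express from "express";

// Hypothetical blocklist; in practice it would be loaded from a file or a
// threat-intelligence feed and refreshed periodically.
const blockedIps = new Set(["203.0.113.7", "198.51.100.23"]);

const app = express();

// Deny requests from known-bad addresses before any other handler runs.
app.use((req, res, next) => {
  if (blockedIps.has(req.ip ?? "")) {
    res.status(403).send("Forbidden");
    return;
  }
  next();
});

app.get("/", (_req, res) => {
  res.send("Hello, human visitor!");
});

app.listen(3000);
```

Note that an application-level blocklist mainly helps against abusive scrapers and small-scale attacks; large volumetric DDoS attacks are usually filtered further upstream.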
- Use Challenge-Response Tests
One of the simplest and most common ways to stop bot traffic is to add a CAPTCHA to sign-up or download forms. This is particularly useful for stopping download bots and spam bots; a verification sketch follows.
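Assuming Google reCAPTCHA v2 on the form (one common choice, not the only one), the server verifies the submitted token against Google's siteverify endpoint. The secret-key handling below is a placeholder.

```typescript
// Server-side verification of a reCAPTCHA v2 token.
// RECAPTCHA_SECRET is a placeholder for the site's real secret key.
async function verifyCaptcha(token: string): Promise<boolean> {
  const params = new URLSearchParams({
    secret: process.env.RECAPTCHA_SECRET ?? "",
    response: token,
  });

  const res = await fetch("https://www.google.com/recaptcha/api/siteverify", {
    method: "POST",
    body: params,
  });

  const data = (await res.json()) as { success: boolean };
  return data.success; // reject the form submission when this is false
}
```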
- Scrutinize Log Files
For web admins with a sophisticated understanding of data and analytics, examining server log files can help find and fix website errors caused by bots, as in the sketch below.
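For instance, a short script can count requests per IP address in a standard access log and surface heavy hitters; the log path and the 1,000-request threshold below are illustrative assumptions.

```typescript
import { readFileSync } from "node:fs";

// Read an access log in common/combined log format, where the first
// field of each line is the client IP address.
const log = readFileSync("/var/log/nginx/access.log", "utf8");

const hitsPerIp = new Map<string, number>();
for (const line of log.split("\n")) {
  const ip = line.split(" ")[0];
  if (!ip) continue;
  hitsPerIp.set(ip, (hitsPerIp.get(ip) ?? 0) + 1);
}

// Report addresses with suspiciously many requests.
for (const [ip, hits] of hitsPerIp) {
  if (hits > 1000) {
    console.log(`${ip} made ${hits} requests - possible bot`);
  }
}
```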
7. How to Detect Bot Traffic in Google Analytics
For publishers using Google Analytics, there are some simple ways to set up your site to filter out bot traffic.
- Firstly, visit the Google Analytics Admin Panel.
- Next, navigate to View Settings in the View column.
- Scroll down to the Bot Filtering checkbox.
- Tick the Bot Filtering checkbox if it is unchecked.
- Finally, hit Save.
8. Why Is It Important to Protect Your Ads?
Any website that is running Pay Per Click ads will at some point be hit by bot traffic of one form or another. It is imperative that publishers take steps to protect their ads, or bot traffic will eventually cause the following issues:
- Website data and analytics may become skewed
- Website load time and performance may begin to deteriorate
- Websites become vulnerable to botnets, DDoS attacks, and ultimately negative SEO results
- CPC is negatively affected, and ultimately revenue may be lost
9. Are you a digital publisher who needs help monitoring bots?
At Netlink, we pair simplified, cutting-edge programmatic advertising technology with impartial guidance to help our clients understand the ad tech landscape and get the most out of the ads on their websites. Contact our friendly team to learn more today via: sale@appmatic.sg.