Tracking Content Performance Despite Bot Traffic Sabotage
You may want to take a seat for this one, because it might come as a shock: the visitor and hit statistics your analytics services provide are most likely built on a house of lies. According to the security specialists at Incapsula, bots now account for upwards of 56 percent of all web traffic. Internet bots are computer programs that perform repetitive tasks, such as visiting websites. Some are good, like the crawlers search engines use to index websites and deliver useful search results. Others are malicious, looking for ways to spam your site, scrape your content for re-purposing, or probe for security holes; the worst try to sabotage your website outright. All these bots, good and bad, wreak havoc on web traffic tracking. Incapsula found that a medium-sized site attracting around 10,000 visitors gets a staggering 63.2 percent of its traffic from bots, and it gets worse for small sites that bring in one thousand hits a day: 80.3 percent of their traffic comes from bots.
If you hire a writer to create content for you, there’s an expectation that it will be read by actual people, not mostly robots. Unfortunately, bots make up the majority of web traffic. Those inflated numbers might look better when pitching advertisement sales, but they don’t give you an accurate picture of how your site is actually doing; you need to know how many real people are viewing it. Around 29 percent of all traffic comes from bad bots, while another 27 percent comes from good bots. Put another way, that works out to roughly one good-bot hit, from a search engine or similar, for every two hits from actual people.
The Two-Step Plan
Advanced Bot Blocking
While tedious, there are a few things your Web IT staff can do to further rule out bots, such as configuring the server to block the IP addresses and domains of established bad bots. This also keeps those bots from wasting server resources. Identifying the offenders, however, can be a slow and imprecise process. IP addresses tied to bad bots often give themselves away by generating an incredible number of hits in your tracking tools; the numbers are usually obvious, as a bot’s hit count may be orders of magnitude higher than any real visitor’s. The SEO specialists at Moz recommend blocking darodar.com, econom.co, ilovevitaly.co, semalt.com, buttons-for-website.com, and see-your-website-here.com for starters.
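The two checks described above, matching referrers against a domain blocklist and flagging IP addresses with suspiciously high hit counts, can be sketched in a few lines of Python. This is an illustrative sketch, not any particular analytics tool’s API: the log-record fields (`ip`, `referrer`) and the hit-count threshold are assumptions you would adapt to your own server logs.

```python
from collections import Counter
from urllib.parse import urlparse

# Referrer-spam domains from Moz's starter list, quoted above.
BLOCKLIST = {
    "darodar.com", "econom.co", "ilovevitaly.co",
    "semalt.com", "buttons-for-website.com", "see-your-website-here.com",
}

def is_spam_referrer(referrer: str) -> bool:
    """True if the referrer's domain, or any parent domain, is blocklisted."""
    host = urlparse(referrer).netloc.lower().split(":")[0]
    parts = host.split(".")
    # Check the host and every parent domain, so a hit from a subdomain
    # like forum.semalt.com is still caught by the semalt.com entry.
    return any(".".join(parts[i:]) in BLOCKLIST for i in range(len(parts)))

def suspect_ips(hits, threshold=100):
    """Flag IPs whose hit counts exceed a threshold -- a crude stand-in for
    spotting the 'orders of magnitude higher' numbers mentioned above.
    `hits` is a list of log records, each a dict with an 'ip' key."""
    counts = Counter(h["ip"] for h in hits)
    return {ip for ip, n in counts.items() if n >= threshold}
```

Once flagged, those domains and IPs would be fed into your server’s actual block rules; the functions here only identify candidates from log data.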
While scrubbing all bot traffic from your web analytics is an endless, and potentially futile, endeavor, simply switching your tracking to Google Analytics and enabling its bot-filtering option, which excludes hits from known bots and spiders, will take care of most of the problem.
Dan S is a former news journalist turned web developer and freelance writer. He has a penchant for all things tech and believes the person using the machine is the most important element.