Get Out of My Stats, Out of My Heart: A Struggle Between the Eternal Optimist and Bad Bots

Eternal Optimism

A month or two of steady traffic growth, followed by a couple of distinct spikes, recently had me stats-smitten. Unfortunately, here on the Internet, if it seems too good to be true, it probably is.

Allow me to set the stage:
It was mid-autumn; a Monday. My heart fluttered as I scribbled weekly analytics numbers across my white board, grinning as if I’d just won the Internet. You see, I’d been watching the patterns for some time and taking note of the steady traffic increases. With the ever-changing state of the industry, I’d promised myself I wouldn’t get anxious just yet—I’d keep my excitement buried until the time was right. It was this particular Monday I knew the time had arrived; this was the day that the song of my heart was heard.

I got to work immediately putting together an analysis that would bring to light the product of perseverance against all odds—something we’d done SO RIGHT that site traffic had nearly doubled over the past two weeks. It was finally time to shine.

It was about that time that my heart fell silent.
Now, I wish I could say that over the past months, I’d been clever enough, conscientious enough to see what had been staring me in the face all along. We weren’t seeing this traffic spike because suddenly more people were interested in what we had to say and we also hadn’t successfully managed to engage many more people from social than before. Our site speed adjustments hadn’t thrown us a win and optimization edits hadn’t suddenly taken flight. We hadn’t accomplished a thing.

As a matter of fact, with this larger traffic spike, our bounce rate was out of control, averaging 20% higher than usual, and many other stats looked pretty skewed, too. After some research, what I dreaded was confirmed: we had a series of unwelcome guests, and they weren’t even human.

BAD BOTS, BAD!
Generally, there’s no need to filter bots (good, bad, or otherwise) out of your website analytics: most bots are intelligent enough to recognize and bypass client-side tracking scripts . . . except, apparently, Microsoft’s bots. These are the bots that parade about, sometimes inflating traffic by a few hundred visits in a day (not to mention swelling my stats-loving heart).

While I can’t say I’ve managed to escape the initial delight of a huge traffic increase, the good news is that there’s hope for the heart of this eternal optimist.

Scrub the stank out of your stats.
As covered extensively by LunaMetrics, you’ve got a few options for cleaning up your stats, but my favorite so far has been this advanced segment, which can be applied in Google Analytics for a more accurate picture (note: you have to be logged in to add it!). Once it’s been added, you’ll find it in your account here:

[Screenshot: exclude-bots-adv-seg — the bot-exclusion advanced segment in Google Analytics]

Compare it to your “All Traffic” segment and see just how much of a difference it makes!

Not sure if you have a problem?
Visit Audience -> Technology -> Browser & OS in Google Analytics. Look for Mozilla Compatible Agent, one of the main culprits.
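If you have access to your server’s raw access logs, you can cross-check what Google Analytics is telling you. The sketch below is a minimal, illustrative example (not the method described above): it tallies user-agent strings from combined-format log lines and flags bare “compatible;” agents, which Google Analytics typically buckets as Mozilla Compatible Agent. The log lines, counts, and the “compatible;” heuristic here are all hypothetical assumptions for demonstration.

```python
import re
from collections import Counter

# In the Apache/Nginx combined log format, the user agent is the
# final quoted field on each line.
UA_RE = re.compile(r'"([^"]*)"\s*$')

def count_user_agents(lines):
    """Tally user-agent strings from combined-format access-log lines."""
    counts = Counter()
    for line in lines:
        match = UA_RE.search(line)
        if match:
            counts[match.group(1)] += 1
    return counts

# Hypothetical sample log lines, for illustration only.
sample = [
    '1.2.3.4 - - [10/Nov/2014:10:00:00 +0000] "GET / HTTP/1.1" 200 512 "-" "Mozilla/4.0 (compatible;)"',
    '5.6.7.8 - - [10/Nov/2014:10:00:01 +0000] "GET /about HTTP/1.1" 200 1024 "-" "Mozilla/5.0 (Windows NT 6.1) AppleWebKit/537.36"',
    '1.2.3.4 - - [10/Nov/2014:10:00:02 +0000] "GET /blog HTTP/1.1" 200 2048 "-" "Mozilla/4.0 (compatible;)"',
]

counts = count_user_agents(sample)

# Rough heuristic: a bare "Mozilla/4.0 (compatible;)" agent is the kind
# of string GA reports as "Mozilla Compatible Agent" and is often
# non-human traffic. (Real browsers can also contain "compatible;",
# so treat this as a starting point, not a verdict.)
suspicious = {ua: n for ua, n in counts.items() if "compatible;" in ua.lower()}
print(suspicious)
```

If the suspicious agents account for a large share of your hits, that lines up with the kind of inflated, high-bounce traffic described above.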

Questions? Let us know in the comments below!